
Forum Moderators: rogerd


WebHostingTalk Hacked and Offline

Worst incident in ages

4:57 pm on Mar 26, 2009 (gmt 0)

Administrator from US 

WebmasterWorld Administrator brett_tabke is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month Best Post Of The Month

joined:Sept 21, 1999
votes: 62

WebHostingTalk [webhostingtalk.com] was maliciously attacked over the weekend. WebHostingTalk is the largest online forum for discussion of web hosting and server-related issues. It is owned by iNet Interactive [inetinteractive.com], which also owns HotScripts.com, Search Marketing Standard magazine, and numerous other forum sites. These guys are not newbies to forum operations and have a quality tech and management system in place.

A hacker gained access to an offsite backup server and then used info on that server to walk into the main live server. The hacker deleted the backup databases, then deleted the live site. Apparently, they also covered their tracks and overwrote the drives so that recovery was impossible. This is the most deliberate, sophisticated, and calculated hack I have heard of in recent memory.

Unfortunately, the last local offline copy of the system is from late last year, so expect them to be offline for a bit while they rebuild the databases.

This is a lesson for ALL forum operators. Our thoughts are with the WHT and iNet teams that are working on the issues.

/off doing backups to dvd disks.

Interviews from HostingCon2008
Including interview with iNet CEO Troy Augustine [searchengineworld.com]

Also, a previous thread on the topic here: [webmasterworld.com...]

[edited by: Brett_Tabke at 8:01 pm (utc) on Mar. 26, 2009]

2:55 pm on Mar 27, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Apr 23, 2003
votes: 0

A friend from Uni has a proper multi-million grossing site with no offline backup. He says he doesn't need it. What do you think?

There are people who don't wear a seatbelt when driving. Most of them are still alive... but that doesn't make me think it's a good idea.

The application server is the only point of exposure to the internet. It has no authority to do anything to any other bit of hardware, and configuration cannot be done over a public connection. A Watchguard or similar is the DNS target, forwarding HTTP or HTTPS traffic to the app server, blocking all other ports and otherwise protecting against malware and attack. SQL injection is prevented. The application server actually has multiple copies offline for development purposes.

The app server has access to the DB server. The DB server can ONLY be accessed from the app server plus the management server, not from the internet. The management server can only be accessed over VPN. The VPN can only be established from pre-determined IPs and terminates on the Watchguard; then a separate SSL VPN needs to be established through that to the management server. (The management server runs the backend stuff, including CRM.) The app server is on a separate VLAN to the management server.

All of that sounds lovely... so I'll raise you - one rogue employee in the data center with physical access to the servers. This is listed as rule three [technet.microsoft.com] - and once you've read that list, this [technet.microsoft.com] is also worth a read.

As far as I'm concerned, you can come up with a way to attack anything - the question is, is it worth it?

If your friend feels his planning is bulletproof, that's nice. If I was a big investor in his business I'd be asking some pretty pointed questions...

3:55 pm on Mar 27, 2009 (gmt 0)

Full Member

5+ Year Member

joined:Dec 2, 2008
votes: 0

Cron backup outside the webroot, downloaded locally every week and burned to hard media once a month. It's not perfect, but I'm not a bank. I was reminded yesterday, as I watched a lady struggle to load a till with a new printer ribbon, that new technology brings new problems no matter what the promise.
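A routine like that can be sketched in a few lines. This is a minimal sketch, not anyone's actual setup; the paths and database name are hypothetical, and it assumes `mysqldump` is on the PATH. The dump lands outside the webroot so the web server can never serve it, and a checksum is written alongside so the weekly local copy can be verified after download.

```python
import hashlib
import subprocess
from datetime import date
from pathlib import Path

# Hypothetical paths -- adjust to your own layout.
BACKUP_DIR = Path("/home/backups")   # outside the webroot
DB_NAME = "forum"

def dump_database() -> Path:
    """Dump the database to a dated file outside the webroot (run from cron)."""
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    dump = BACKUP_DIR / f"{DB_NAME}-{date.today().isoformat()}.sql"
    with dump.open("wb") as out:
        subprocess.run(["mysqldump", DB_NAME], stdout=out, check=True)
    return dump

def checksum(path: Path) -> str:
    """Write a SHA-256 next to the dump so the downloaded copy can be verified."""
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    path.with_suffix(".sql.sha256").write_text(f"{digest}  {path.name}\n")
    return digest
```

After pulling the file down each week, re-hashing it locally and comparing against the `.sha256` file catches a truncated or corrupted transfer before the disc gets burned.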
4:32 pm on Mar 27, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Dec 20, 2004
votes: 0

I like to print my pages out at the end of every day, and then file them in my attic (to protect them from flooding in the basement). I use photo-paper since it seems more durable... longer lasting backups.

(joking... ;-)

There is NO fool-proof server lock down and/or backup technique. As others have stated, even if you shut down every possible digital door, there is still a lot of physical and social doors open.

If someone is determined enough, they will always find a way. Which seemed to be the case here.

5:04 pm on Mar 27, 2009 (gmt 0)

Preferred Member

10+ Year Member

joined:Feb 18, 2003
votes: 0

I wonder what percentage of the username/e-mail and password combos they now have access to would work at major financial websites?

I've been a member there since 2001 with the same username as here and I'm pretty sure the same password. But none of my financial or e-mail accounts use this same username or password combo.

4:20 pm on Mar 28, 2009 (gmt 0)

Junior Member

10+ Year Member

joined:June 18, 2003
votes: 0

Another important thing to think about is checking your backups to make sure they are actually functioning and backing up what you intend to back up.
9:46 pm on Mar 28, 2009 (gmt 0)

Junior Member

10+ Year Member

joined:Apr 28, 2004
votes: 2

"When, not if" seems to be the most useful mentality for approaching data loss. Or really, any type of loss.

Agreed on using rsnapshot. Also, rdiff-backup might be useful for some. If you aren't cronning a part of your backup procedure, a one-word alias for your backup script helps to combat laziness.

Never fully trust the backup policy of your webhost or datacenter. Routinely check on them by verifying the backups they make for you, and always maintain your own independent solution for keeping local backups. Backups on geodistant servers help me sleep a little better too.

External media, hard drives, databases, software, employees -- they all fail and/or corrupt regularly.

RAID is more of an availability solution than a backup solution.

Anyone have details on how to verify the integrity of an SQL dump file, or a master-slave setup -- beyond just checking/repairing the tables? If you have a forum with millions of posts, how do you "check" your backups for corruption, or nefarious alteration? It's often impractical to backup a new 10 gig file every day. Do you just keep daily diffs, and then backup fully once per month or so? What do you use for verifying database integrity?

7:40 am on Mar 29, 2009 (gmt 0)

Full Member

10+ Year Member

joined:Dec 7, 2000
posts: 267
votes: 0

Wow, this does sound scary. Even though my site is set for daily backup, it seems like a better idea to also set up a daily backup from the remote server to a local computer.

It would take tons of time to get site back up if everything is wiped out.

I feel sorry for WebHostingTalk, but I know they have capable people to fix the issue and prevent such mishaps in the future.

10:26 am on Mar 29, 2009 (gmt 0)

Senior Member

joined:July 29, 2007
votes: 100

It is pretty clear that only those working with the backups even knew what the server address was - let alone how someone got into it.

Anyone who had a link in any article could find the backup server the moment someone clicked on a link from the backup site, be it bot/search engine/or other.

Anyone who had an image in any article could find the backup server the moment the image was loaded.

Having a backup database is great but if you run an actual backup site, meaning the whole site is live even if hidden/protected/robots.txt blocked, you've got the potential to be found.

Did they run a backup site or just maintain a backup copy of code without having a protected (but live) environment?

5:54 pm on Mar 29, 2009 (gmt 0)

Preferred Member

10+ Year Member

joined:Dec 1, 2003
votes: 0

Disgusting... hoping for better luck for them, and worse luck for the perp.
7:19 pm on Mar 29, 2009 (gmt 0)

Administrator from US 

WebmasterWorld Administrator incredibill is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Jan 25, 2005
votes: 99

Having a backup database is great but if you run an actual backup site, meaning the whole site is live even if hidden/protected/robots.txt blocked, you've got the potential to be found.

Which is why a backup server, after initially being setup, should be firewalled off and be a black hole, nothing running, no pages being loaded, and only SSH access in and out until being called into service.

However, odds are whatever weakness is in your primary server has been cloned into the backup as well so if you get hacked once the backup is usually a sitting duck once it goes live.

7:24 am on Mar 30, 2009 (gmt 0)

Full Member

10+ Year Member

joined:Oct 22, 2002
votes: 0

Funny, I was reprimanded and ridiculed for my real-time offsite backup setups, with daily hardcopy dumps and three separate devices onsite to do backups another way. AND I used different types of systems to tie them together, so a new hack on one wouldn't work on the next server. Same deal for a new fault introduced by day-to-day updates and revs...

So when I left, the new guy got rid of all the documented processes except some basic server backups. Last I heard, he was restoring my last hardcopy backup (almost a year old) to recover data missing due to power failures, equipment failures, and user error.

Even with the new trend of supposedly easy backups, complacency will bankrupt you financially and emotionally when you can least afford either. You must pay homage to the zen of backup or it will stick you good.

4:14 am on Mar 31, 2009 (gmt 0)

Full Member

10+ Year Member

joined:Apr 28, 2005
votes: 0

Could the attackers have done the same with Slashdot, SourceForge, and Freshmeat, which went down yesterday/today? We'll know when they come back live again; see [webmasterworld.com...]
10:58 pm on Mar 31, 2009 (gmt 0)

Preferred Member

10+ Year Member

joined:Apr 16, 2004
votes: 0

regarding passwords:

they used vbulletin which uses a salted md5 hash.

to get any information they'll have to construct a rainbow table using that particular salt, if I'm not mistaken.

in other words, only "weak" passwords (probably dictionary, or simple variations of them) will likely be available to them.

if you were using a strong password, the chances of them finding it are slim to nil.
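The scheme above is commonly described for vBulletin 3 as md5(md5(password) + salt). A quick sketch of why the per-user salt helps: a generic precomputed rainbow table is useless, so the attacker is pushed back to per-salt brute force, which only cracks weak dictionary passwords in reasonable time. (This is an illustration of the concept, not a claim about WHT's exact configuration.)

```python
import hashlib

def vb_hash(password: str, salt: str) -> str:
    """Salted hash as commonly described for vBulletin 3:
    md5(md5(password) + salt). The per-user salt means a generic
    precomputed rainbow table doesn't apply; the attacker must work
    per salt."""
    inner = hashlib.md5(password.encode()).hexdigest()
    return hashlib.md5((inner + salt).encode()).hexdigest()

def dictionary_attack(target_hash, salt, wordlist):
    """Weak (dictionary) passwords still fall quickly, salt or not."""
    for candidate in wordlist:
        if vb_hash(candidate, salt) == target_hash:
            return candidate
    return None
```

A password in the wordlist is recovered almost instantly; a strong random password never appears in any feasible wordlist, which is exactly the "slim to nil" point above.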

11:42 pm on Apr 7, 2009 (gmt 0)

New User

10+ Year Member

joined:Feb 14, 2008
posts: 34
votes: 0

The news just got worse over there. I really feel for Troy and his team. Hacker scum and their shadow masters wanting to punish!

UPDATE: 7:14pm est 04/07/09
From what we know now, there were more records on the database server where the credit card dump was taken. If research shows that a larger number of customers' data was compromised, we will contact those individuals directly.

UPDATE: 4:24pm est 04/07/09

We have contacted all major credit card companies and are awaiting their guidance. It should be noted that card holders will not be held liable for any fraudulent purchase made using their credit card.

ANNOUNCEMENT - 1:25pm est 04/07/09

This morning, the hacker who attacked WHT initiated further communication. He provided evidence that credit card information on one of our database servers was, in fact, compromised during that attack.

What is WHT and iNET Interactive doing about it?
If we have evidence or suspicion that your credit card information was leaked, you will be receiving further communication from WHT and iNET Interactive.

Why is WHT down and when do we expect it to be back up?
We're currently doing a full security sweep of our cluster to ensure the servers are secure. The site will be back up once this security review is complete.

7:05 am on Apr 8, 2009 (gmt 0)

New User

10+ Year Member

joined:Feb 14, 2008
posts: 34
votes: 0

Update -- re: the credit card breach I posted a specific thread about this topic here [webmasterworld.com].