---- Some research on Google quality raters' behavior
1script - 4:40 am on Sep 27, 2011 (gmt 0)
Just had an interesting datapoint added to my little research, compliments of Google. So, I'll share here in case someone's still interested in the subject of Google's quality raters.
Four of my reconsideration requests got returned today, all denied ("some or all of your pages still violate our quality guidelines"). The requests were submitted at different times over the course of almost a month. All notifications are boilerplate-identical, and all are date-stamped with the same time, i.e. they came within a minute of each other.
Of the four sites involved:
One no longer exists(!). I decided it wasn't worth the trouble and pulled the hosting account five days after sending the recon request. Funny how they say it "still violates the guidelines" - whatever the violation was, it's gone.
One has been extensively combed over and prettied up, features added, ads reduced, speed improved.
One has been left virtually untouched (I found an outstanding DMCA complaint against it and removed the offending content, thinking it could be the reason), but ads have been reduced.
One was otherwise untouched, but ads of all types were completely removed.
None of the three still-working sites has registered a single non-bot hit from a Google network since September 1st, which for some of the sites covers more than the entire time it took them to respond. They obviously didn't look at the dead site - there's nothing there to see.
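In case anyone wants to repeat the check: this is roughly what I mean by looking for non-bot Google hits. It's only a sketch against a standard combined-format access log; the IP prefixes are a couple of well-known Google ranges given as examples, not a complete list, and the log path is whatever yours happens to be.

    import ipaddress

    # A couple of well-known Google ranges -- examples only, not a complete list.
    GOOGLE_PREFIXES = [ipaddress.ip_network(p) for p in ("66.102.0.0/20", "74.125.0.0/16")]

    def is_google_ip(ip):
        try:
            addr = ipaddress.ip_address(ip)
        except ValueError:
            return False                       # malformed log line, skip it
        return any(addr in net for net in GOOGLE_PREFIXES)

    hits = 0
    with open("access.log") as log:            # hypothetical log path
        for line in log:
            parts = line.split()
            if not parts:
                continue
            ip = parts[0]                      # client IP in combined log format
            if "googlebot" in line.lower():    # skip the crawler itself
                continue
            if is_google_ip(ip):
                hits += 1
                print(line.strip())

    print("Non-bot hits from Google address space:", hits)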
So, I'm drawing these conclusions:
They don't actually look at the sites (4 sites in 1 minute is not much time). My best hope is that they go by some parameter prepared for them by the algo. In which case, why the heck do they have human reviewers?
They bunch together all requests from the same webmaster, which makes me think the "trustworthiness" of the webmaster weighs heavily on the decision.
If anyone is willing to give them the benefit of the doubt that they actually look at anything, then they rely entirely on the cached copy, and possibly even on a preview snapshot; otherwise there would have been hits on the images - or they simply aren't interested in looks and usability, strictly in content and HTML. From now on, all my future websites will have <meta name='robots' content='noarchive'/> on all content pages (see the quick check at the end of this post). Otherwise you let them judge you by the contents of a cache collected at an unknown time, very likely BEFORE you made any changes - when a site gets banned, the bot's activity goes way down, and with it, I assume, their ability to refresh the cache.
The number of ads by itself is not a factor - ads have been getting a really bad rap here lately.
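And on the noarchive point above: here's the kind of quick check I'd run to make sure the tag actually ends up on every content page. It assumes the pages are static HTML files under a local directory (the path is made up), and it's only a sanity check of my own markup, not a claim about how Google reads the tag.

    import re
    from pathlib import Path

    # Matches a robots meta tag whose content includes "noarchive",
    # e.g. <meta name='robots' content='noarchive'/>
    NOARCHIVE = re.compile(
        r"<meta[^>]+name=['\"]robots['\"][^>]+content=['\"][^'\"]*noarchive",
        re.IGNORECASE)

    missing = []
    for page in Path("public_html").rglob("*.html"):   # hypothetical site root
        if not NOARCHIVE.search(page.read_text(errors="ignore")):
            missing.append(page)

    for page in missing:
        print("missing noarchive:", page)
    print(len(missing), "page(s) missing the tag")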