we do sometimes take manual action to deal with problems like malware and copyright infringement. Like other search engines (including Microsoft's Bing), we also use exception lists when specific algorithms inadvertently impact websites, and when we believe an exception list will significantly improve search quality.
Search engines do not run a ranking contest for sites, checking all entrants to make sure the contest rules are being followed. That's just not the Google mindset at all.
But if you read my post, you'll see that I try to get into the mind of the programmers.
Parthasarathy:
...
But the interesting part is that any time we have these static overrides, we will make sure that the next iteration of the algorithm actually handles these lists. So these lists are constantly evolving.
It is not like ABC.com or any specificsite.com is always going to be on the whitelist or always going to be on the blacklist. It just evolves, basically, because we do have a manual-slash-algorithmic approach to that.
Cutts: Yeah [agreeing with the preceding statement], it is also important to realize that there are many, many algorithms, maybe the majority of the algorithms, that don't have any exception lists. For example, with Panda, we don't have, there is no way right now, to do any manual exceptions.
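To make the idea concrete, here is a minimal sketch of what a per-algorithm exception list might look like in practice: each ranking signal can carry its own list of exempted sites, and some signals (like Panda, per the quote above) have no list at all. All of the names here (apply_ranking, EXCEPTION_LISTS, the algorithm labels, the example domain) are hypothetical illustrations, not Google's actual implementation.

```python
# Hypothetical sketch: per-algorithm exception lists layered over
# algorithmic demotions. Names and values are illustrative only.

# Each algorithm may carry its own exception list; some have none.
EXCEPTION_LISTS = {
    "page_layout": {"example-white-space-site.com"},  # sites exempted from this filter
    "panda": set(),  # no manual exceptions possible
}

def apply_ranking(site, base_score, demotions):
    """Apply each algorithm's demotion unless the site is on that
    algorithm's exception list (a static override)."""
    score = base_score
    for algorithm, penalty in demotions.items():
        if site in EXCEPTION_LISTS.get(algorithm, set()):
            continue  # static override: skip this algorithm's demotion
        score -= penalty
    return score

# A site hit by the page-layout filter but present on its exception
# list keeps that signal's points, while Panda still applies.
print(apply_ranking("example-white-space-site.com", 10.0,
                    {"page_layout": 3.0, "panda": 1.5}))  # -> 8.5
```

The point of the sketch is that the override lives outside the algorithm itself, which is why, as the transcript says, the next iteration of the algorithm can be tuned so the list shrinks or disappears.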
Like other search engines (including Microsoft's Bing), we also use exception lists when specific algorithms inadvertently impact websites, and when we believe an exception list will significantly improve search quality. We don't keep a master list protecting certain sites from all changes to our algorithms.
The one case I know of explicitly (noted by tedster) was for a site owner whose site was 'filtered' for having too much white space at the top of the page.
But lately, content has been my focus.