Forum Moderators: Robert Charlton & goodroi
Like other search engines (including Microsoft's Bing), we also use exception lists when specific algorithms inadvertently impact websites, and when we believe an exception list will significantly improve search quality. We don't keep a master list protecting certain sites from all changes to our algorithms.
The most common manual exceptions we make are for sites that get caught by SafeSearch - a tool that gives people a way to filter adult content from their results. For example, "essex.edu" was incorrectly flagged by our SafeSearch algorithms because it contains the word "sex." On the rare occasions we make manual exceptions, we go to great lengths to apply our quality standards and guidelines fairly to all websites.
[seroundtable.com...]
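The essex.edu example is easy to reproduce. Here is a minimal Python sketch (not Google's actual SafeSearch code; the term list and exception set are invented for illustration) showing how a naive substring filter trips on "essex.edu" while a word-boundary match plus a manual exception list does not:

```python
import re

# Illustrative sketch only -- not Google's actual SafeSearch code.
# The term list and exception set below are invented for the example.
ADULT_TERMS = ["sex"]
MANUAL_EXCEPTIONS = {"essex.edu"}  # hand-reviewed false positives

def naive_flag(domain: str) -> bool:
    # Plain substring match: flags "essex.edu" because "essex" contains "sex".
    return any(term in domain for term in ADULT_TERMS)

def safer_flag(domain: str) -> bool:
    if domain in MANUAL_EXCEPTIONS:
        return False  # the manual exception list described in the quote
    # Word-boundary match: "sex" inside "essex" has no boundary before it.
    return any(re.search(r"\b" + re.escape(term) + r"\b", domain)
               for term in ADULT_TERMS)

print(naive_flag("essex.edu"))    # True  -- the false positive Google describes
print(safer_flag("essex.edu"))    # False -- exception list plus boundary check
print(safer_flag("sex.example"))  # True  -- still catches the real term
```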
John said he "set a flag" so that any time the algo picked out his site for that issue in the future, it would let the site pass but trigger a human review instead.
Cutts: Yeah, it is also important to realize that there are many, many algorithms, maybe the majority of the algorithms, that don't have any exception logs. For example, Panda: we don't have... there is no way right now to do any manual exceptions.
Sullivan: No, no, I read on Cult of Mac that you'd exempted things for them. [sarcasm]
Cutts: Nope
[searchengineland.com...]
The article also has a video of this part of the session. Unfortunately the visual part is just the session slide. But you do get to hear the voices and that gives other signals that a mere transcript can't convey. We also get a new piece of insider language from Google: "exception log."
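Put together, John's "set a flag" description and Cutts's "exception log" suggest a fairly simple mechanism. A rough sketch of how such a flag might behave, with all names hypothetical since Google has not published any of this:

```python
# A rough sketch of the "exception log" idea, assuming a per-site flag that
# skips one specific algorithmic demotion and queues the case for human
# review instead. All names are hypothetical; Google has not published this.
exception_flags = {"example.com"}  # sites a reviewer has flagged for this algo
review_queue = []                  # cases a human should look at

def apply_demotion(site: str, score: float, demotion: float) -> float:
    if site in exception_flags:
        review_queue.append(site)  # "let it pass, but trigger a human review"
        return score               # score is left untouched
    return score - demotion

print(apply_demotion("example.com", 10.0, 3.0))  # 10.0, queued for review
print(apply_demotion("other.com", 10.0, 3.0))    # 7.0, demotion applied
```

Note this is per-algorithm, which fits Cutts's point: a flag like this only exists where someone has built an exception log for that particular algorithm, and Panda has none.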
Having canonical problems like that is a bit of a different thing from whitelists, isn't it?

Ted, is it? I am not so sure.
Maybe it's related to consistent internal linking. The whole site displays at the IP address, the non-www version, and the www version. If this were to happen to a site of mine today, it would be nuked. There has to be something that prevents this site from being affected. I have watched this site go through all the Google updates and nothing has ever affected it. I have for a long time suspected a whitelist. I might add that the links were added 3-4 years ago, so if it was a past problem, it didn't work then either.
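For reference, the usual fix for that kind of duplicate-host problem is a site-wide 301 to a single canonical host. A bare-bones Python sketch of the logic (the host name is hypothetical, not the poster's site):

```python
# Bare-bones sketch of fixing the duplicate-host problem described above:
# the same content answering on the raw IP, the bare domain, and www.
# The canonical host below is hypothetical, not the poster's site.
CANONICAL_HOST = "www.example.com"

def canonical_redirect(request_host: str, path: str):
    """Return a 301 target when the host is non-canonical, else None."""
    if request_host != CANONICAL_HOST:
        return "http://" + CANONICAL_HOST + path
    return None

print(canonical_redirect("93.184.216.34", "/page"))    # redirect the raw IP
print(canonical_redirect("example.com", "/page"))      # redirect bare domain
print(canonical_redirect("www.example.com", "/page"))  # None: already canonical
```

The same rule is normally expressed in the web server's rewrite config rather than application code, but the logic is identical: one host answers, everything else 301s to it.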
For example, "essex.edu" was incorrectly flagged by our SafeSearch algorithms because it contains the word "sex."
Cutts: Yeah, it is also important to realize that there are many, many algorithms, maybe the majority of the algorithms, that don't have any exception logs. For example, Panda: we don't have... there is no way right now to do any manual exceptions.
The European Commission in November started investigating whether Google is illegally directing users of its search engine to websites it owns or is affiliated with, and whether Google is stopping websites from accepting competitors' ads.
I think the claim that Google's results are natural and purely algorithm-based is pretty much debunked.
Google has stated that they hand-manipulate results, and that includes placing their own links at the top of SERPs.
This will add further weight to the government charges of a monopoly abusing its position.