| 7:37 pm on Nov 28, 2008 (gmt 0)|
I don't think anyone who is not a Google employee can tell us the exact method by which a human decision gets fed back into the ranking data. I can only offer something pretty generic - there seems to be a set of extra fields or tags/flags that are related to a given domain or even to a single page. When they are set, these fields cause the initial or "raw" relevance numbers to be lowered, either by subtracting a fixed amount or by applying a fractional multiplier.
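As a purely illustrative sketch of the mechanism described above - every name and number here is my own invention, nothing from Google - such a flag might adjust a raw score like this:

```python
# Hypothetical sketch: per-page penalty fields adjusting a raw relevance
# score, either by a subtractive offset or by a fractional multiplier.
# All names and values are invented for illustration.

def adjusted_score(raw_score, penalty_offset=0.0, penalty_multiplier=1.0):
    """Apply a subtractive offset and/or a fractional multiplier to a raw score."""
    return (raw_score - penalty_offset) * penalty_multiplier

# A page with no flags set keeps its raw score:
print(adjusted_score(100.0))                          # 100.0
# A flagged page is pushed down by an offset or a fraction:
print(adjusted_score(100.0, penalty_offset=30.0))     # 70.0
print(adjusted_score(100.0, penalty_multiplier=0.5))  # 50.0
```

Whether the real system works anything like this is anyone's guess; the point is only that a single stored field per domain or page is enough to depress a precomputed relevance number.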
The manual penalties are probably not set in a technically different manner from algorithmic penalties. It's just that some manual records might need a human action to be placed or removed.
There's a description of such mechanisms in the Google phrase based indexing patents [webmasterworld.com] - especially the one about detecting spam. Another relevant patent that offers some insight is the human editorial input patent [webmasterworld.com].
| 7:27 am on Nov 29, 2008 (gmt 0)|
I remember a thread that mentioned large numbers of Indian reviewers. Any news on that?
| 8:05 am on Nov 29, 2008 (gmt 0)|
The human editorial reviewers are still in place - in something like 40 countries I've heard, including the US. That's what the second thread I linked to above is all about.
As I read it, those reviewers do not actually place heavy-duty -30 or -950 penalties. They seem mostly to be used for quality-checking various SERPs and giving a human look at how well the algo is doing.
From my reading, their effect depends on a consensus of several independent reviews, and is more likely to be a tweak up or down rather than a huge penalty. Of course, if they see something really out of line, I'm sure they have a whistle they can blow to get some full-time engineer's attention.
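To make the "consensus of several independent reviews" idea concrete, here's a toy sketch - the rating scale, threshold, and tweak size are all made up for illustration, not anything Google has disclosed:

```python
# Hypothetical sketch: independent reviewer ratings are combined, and only
# a clear consensus produces a small tweak. Numbers are invented.

def consensus_tweak(ratings, threshold=0.5, tweak=0.05):
    """ratings: list of -1 (bad), 0 (neutral), +1 (good) from reviewers."""
    if not ratings:
        return 0.0
    mean = sum(ratings) / len(ratings)
    if mean >= threshold:
        return +tweak   # clear positive consensus: slight boost
    if mean <= -threshold:
        return -tweak   # clear negative consensus: slight demotion
    return 0.0          # mixed opinions: no change

print(consensus_tweak([1, 1, 0, 1]))   # 0.05 (clear positive consensus)
print(consensus_tweak([1, -1, 0]))     # 0.0  (no consensus)
```

The design point this illustrates: requiring agreement across several reviewers keeps any single human opinion from moving a ranking on its own.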
| 2:12 pm on Nov 29, 2008 (gmt 0)|
thanks for the info ted.
| 3:00 pm on Nov 29, 2008 (gmt 0)|
Much talk about manual/human penalties is likely unsubstantiated, unjustified speculation. The Web is too big for humans to police. The exception may be spam reports, but those are submitted to Google by outside humans. I doubt there's a Google Gestapo looking for sites to smack!
| 11:26 pm on Nov 29, 2008 (gmt 0)|
"I keep hearing about manual penalties by google i.e -30, 950"
A 950 penalty will never be manual. I don't know much about -30 ones but I can't imagine they would be manual either.
| 11:39 pm on Nov 29, 2008 (gmt 0)|
The old -30 penalty [webmasterworld.com] did seem to be manual - but we don't see it around these days. That old -30 penalty made it so no search at all could return the domain higher than #31, not even a search on "example.com". It was not a common penalty and seemed to be a particular smack on the behind for certain sites practicing some very manipulative linking techniques.
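The described effect - no query can return the domain higher than position 31 - can be sketched as a simple reordering step. This is just my rough model of the behavior reported in that thread, not how Google actually implemented it:

```python
# Hypothetical sketch of the old "-30" effect: every result from the
# penalized domain is pushed down so it cannot appear above position 31
# (assuming at least 30 other results exist for the query).

def apply_minus_30(ranked_urls, penalized_domain):
    """Reorder a SERP so the penalized domain's URLs start at position 31."""
    clean = [u for u in ranked_urls if penalized_domain not in u]
    hit = [u for u in ranked_urls if penalized_domain in u]
    # Keep the first 30 clean results, then re-insert the penalized URLs.
    return clean[:30] + hit + clean[30:]

serp = ["http://example.com/"] + ["http://site%d.com/" % i for i in range(40)]
demoted = apply_minus_30(serp, "example.com")
print(demoted.index("http://example.com/") + 1)  # 31
```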
Today, if you drop 30 positions, it is more likely to be just for that search and it could well be automated. Google certainly prefers the algo over manual work, and over the years they've moved many penalties from manual to automated, as well as recovery from those penalties.
| 12:58 am on Nov 30, 2008 (gmt 0)|
When you fix what you suspect is the problem with your site - does the penalty automatically get lifted? Or do we always need to file a manual reconsideration request?
| 4:08 am on Nov 30, 2008 (gmt 0)|
Many penalties do get lifted automatically, but some do not. A reconsideration request is a good idea, no matter whether you suspect a manual or automatic penalty. Remember that a reconsideration request will bring about a human inspection, and they may notice things about your site that were not part of the original penalty.
| 11:26 pm on Dec 2, 2008 (gmt 0)|
|The Web is too big for humans to police. |
The human reviewers only need to see sites that rank high, namely, sites that have snuck through the algorithms. I would expect Google to know niches that suffer from abuse and ask the reviewers to evaluate those sites first.
I'd also expect a methodology to evaluate the SEO-clueless but otherwise valuable sites, so that they can be given a boost.
| 4:22 am on Dec 3, 2008 (gmt 0)|
They pick a keyword, domain name, page URI, all of the above... whatever their fancy, and they say "that no longer exists for you buddy, buh bye".
All incoming links that used the keyword they filtered suddenly vanish for you... which leads to a massive fall in most cases, but they can make mini falls for specific portions of your site too. They can also manually adjust any site up or down. I think one favorite blanket ban that is more widely used right now is the "can't rank anything on page one" button. It's a big stick reserved for link dealers mostly.
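A minimal sketch of the keyword-filtering idea above - discounting every incoming link whose anchor text contains a filtered phrase. The data shapes and function are my own invention, purely to illustrate how a single filter could wipe out a whole class of links:

```python
# Hypothetical sketch: drop incoming links whose anchor text contains a
# filtered keyword, so they no longer count toward the target's ranking.

def surviving_links(links, filtered_keyword):
    """links: list of (source_url, anchor_text) pairs. Keep only clean anchors."""
    return [(src, anchor) for src, anchor in links
            if filtered_keyword.lower() not in anchor.lower()]

links = [("http://a.example/", "blue widgets"),
         ("http://b.example/", "great resource"),
         ("http://c.example/", "cheap Blue Widgets")]
print(surviving_links(links, "blue widgets"))
# Only the "great resource" link still counts; the massive-fall effect
# follows from how much anchor text in a niche repeats the same keyword.
```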
I wonder how many older sites are wasting tons of money trying to fix issues they see exist without knowing it's not possible thanks to such a micro-ban. I'd LOVE more transparency on this but that's not possible, if they told you why they downgraded your site people would evaluate and bypass Google's measures.
Edit: Many automatic bans are triggered only when a site reaches the upper ranks in the SERPs. It's expected for new and under-construction sites to have weaknesses, and it's also expected that those be gone before they rank in the top 10. Many automatic bans are also only semi-automatic, as it takes a human to agree with what the system has flagged for review.
[edited by: JS_Harris at 4:24 am (utc) on Dec. 3, 2008]