What we know: 1) Human raters work as a control group for the Google engineers (confirmed by Matt + leaked manual documents). They use the data to improve the algo, but Google says the raters are not used to directly change the rankings. 2) There is also a confirmed webspam team penalizing bad actors and granting whitelistings (at the algo level) for specific sites (also confirmed).
Question: Is Google also using human raters to directly change the rankings of specific sites in a positive way or is this done only through the algo?
I don't think the average rater can perform manual actions on individual websites. But they may be able to refer a site to a supervisor or higher level person who does have the authority to perform manual actions.
Msg#: 4497446 posted 11:11 am on Sep 20, 2012 (gmt 0)
They use the data to improve the algo. But Google says the raters are not used to directly change the rankings.
Matt Cutts talked about this area informally at Pubcon Vegas last year, during an intense and extended discussion at the evening networking event. He described Google's use of human raters as algorithm testers, just as Sabrina described, and he explained why only that use of human raters made sense for the way they build search rankings.
That's the only "confirmation" that I'm aware of, and it was definitely not a formal one, but it was very public. You know the way that Matt can get surrounded by people at conferences. I'm sure at least 20 people were in that group discussion, maybe 30.
Is Google also using human raters to directly change the rankings of specific sites in a positive way or is this done only through the algo?
Only Google knows for sure but...
- Several raters must rate a page before the rating is accepted
- Something algorithmic determines when a page is up for review
So from that limited information I would think that pages come up for review when the algo decides a page *might* be worthy of page one results. If the raters think so too, welcome to the club. If the raters don't think the page is prime time material, and you accumulate many of these unworthy pages, then the entire site might struggle.
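To make the speculation above concrete, here is a purely hypothetical sketch in Python. Nothing here comes from Google: the function names, thresholds, and labels are all invented for illustration. It only models the two points listed earlier, that something algorithmic nominates a page for review and that several raters must weigh in before a rating is accepted.

```python
# Hypothetical sketch of the speculated review flow. All names and
# thresholds are assumptions made for illustration, not Google's process.

def algo_flags_for_review(algo_score, threshold=0.8):
    """The algo nominates a page for human review when it *might*
    be worthy of page-one results (an assumed rule)."""
    return algo_score >= threshold

def accept_rating(ratings, min_raters=3):
    """A rating only counts once enough raters have weighed in;
    returns the majority verdict, or None if there are too few ratings."""
    if len(ratings) < min_raters:
        return None
    positive = sum(1 for r in ratings if r == "quality")
    return "quality" if positive > len(ratings) / 2 else "low_quality"

# Example: the algo flags a page, and three raters split 2-1 in its favor.
verdict = None
if algo_flags_for_review(0.85):
    verdict = accept_rating(["quality", "quality", "low_quality"])
```

Under this toy model, a single dissenting rater can't sink a page, and a page the algo never flags is never rated at all, which matches the guess that review is algorithmically triggered.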
That's a simplified version of how I would do it, anyway. Keeping the number of low quality pages to an absolute minimum has always served me well, and I use googlebot noindex meta tags quite liberally on my own sites. I suspect it's also how Google trims the spam out of their index after an algo update, since some spam always seems to float for a day or two afterward.
Tip: keep an eye on which pages Google does not send traffic to. If a page hasn't received traffic from Google in some time, it's safe to remove it from the index with a googlebot noindex tag until you improve it or delete it.
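For anyone who hasn't used it, the googlebot-specific noindex tag mentioned above goes in the page's head section. The googlebot variant targets only Google's crawler, while the robots variant applies to all compliant crawlers:

```html
<!-- Tells only Googlebot not to index this page; other crawlers ignore it -->
<meta name="googlebot" content="noindex">

<!-- Or, to keep the page out of all compliant search engines: -->
<meta name="robots" content="noindex">
```

Note the crawler still has to be able to fetch the page to see the tag, so don't also block it in robots.txt.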
They can tweak the algo any way they want in order to get a desired result.
To some degree, sure, but I think you're writing off what it takes way too easily. Yes, they could tweak the algorithm so that one particular search comes up exactly the way they want. But that approach would have side effects that trash other searches. And a high percentage of all searches every day have never been seen before...
I once worked on an internal search algorithm for a large corporate knowledge base. That opened my eyes to the challenges involved. The company had control (to a degree at least) of hundreds of millions of internal documents and their meta tags.
And yet getting one search algorithm to produce acceptable results for any user search, even just the more common ones, proved to be a challenge the team did not meet very well in a two-year effort.
How much more challenging is the database of all web documents, authored with all manner of non-standardized approaches?