Forum Moderators: Robert Charlton & goodroi
Q: How do they tell if they have bad results?
A: ...they have 10,000 human evaluators who are always manually checking the relevance of various results. (Article by attendee Dare Obasanjo [25hoursaday.com])
Next time we're trying to figure out some odd change in the SERPs, we might do well to remember this human factor, eh?
There are patents for schemes which explicitly use user feedback to adjust ratings, and most of those patents are from Neelakantan Sundaresan at IBM Almaden Research, not from Google.
The fundamental trouble with using user feedback to drive search is that it's too easy to spam. Unless the number of raters per thing rated is high, as for movies, or you have to do an actual transaction with real money to submit a rating, as with eBay, most ratings will come from involved parties. Which does not help.
They give multiple evaluators the same work, so that they get a consensus answer. Very basic "is this page spam or not" work. The purpose is to check examples of spam, not all spam. 10,000 evaluators would imply that they are making dozens of tweaks to the algorithm every day, and have 500 staff doing the tweaks... Google's search algorithm is too important to have that many people working on it at once.
I'd guess the real figures to be more like 20 working on the algorithm, and 100-300 evaluators.
The problem with this is a bunch of people will blame any problems they have to human evaluators.
I don't think Google would regard that as a problem. They must be pretty thick-skinned by now. :-)
The bottom line is that 90% of the SEO tricks that we all learned in 1999 are worthless today. You cannot fool Google into thinking that your site deserves to be #1 in the results through some kind of trickery. Maybe you can temporarily, but in the end, you'd better have a site that looks good, works well, and that people like. If you don't, all the meta-tag tweaking in the world isn't going to make a difference.
For years my company has been designing sites, and after a year or two online, the customer comes back to us saying "This is incredible! Google LOVES us!" Now we know why. Because the sites we design are good, they look good, they work well, and when the human evaluator comes to look at the site for even a minute or two, they give it the thumbs up.
It's been so obvious that they do this, just by watching Google's behavior. People said I was crazy, they said that there are too many websites out there to do this, they said that Google has publicly stated that they seek to completely automate their process. But I don't care what anybody says--I care about what their software does. And their software has been screaming of human interference loud and clear ever since the day they went online.
That's why I feel completely justified in screaming "HAH! I TOLD YOU SO!" at the top of my lungs right now.
Cheers,
Bolotomus
google.com/evaluation/search/rating/task-edit?task=
Get the IP, and find all hits from that IP in your logs:
Here is what I am seeing from the same (non Google) IP:
- Page1 is requested as a direct hit, no referrer (he is browsing the web)
- 43 min. later the same page is hit with the evaluation URL as the referrer (he decided to evaluate my page)
- 30 seconds later the same page is hit, no referrer (he hit an external link on the page and then the back button)
- 2.5 hours later the same page is hit with the evaluation URL as the referrer (I have no idea why)
- He bookmarks the page (plans to visit again)
And this is a very minor, trivial page on my site. The same IP does not surf to other links on the page and does not try to find out more about my site or what it does; it is just evaluating the page independently of anything else. I have also looked up everyone else who has requested the page since, and it has not been visited by any Google IP or any other evaluation referrer.
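If you want to replicate this kind of check yourself, here is a minimal sketch of the "find all hits from that IP" step. It assumes an Apache/Nginx "combined" access-log format and made-up sample entries; the `google.com/evaluation` referrer substring matches the evaluation console URL quoted earlier in the thread, but adapt the pattern and path to whatever your own logs actually show.

```python
import re

# Hypothetical pattern for the Apache/Nginx "combined" log format;
# adjust it if your server logs a different layout.
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) \S+ '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

def hits_from_ip(lines, ip):
    """Return (time, request, referrer, is_evaluator) for every hit
    from one IP, flagging requests whose referrer looks like Google's
    evaluation console."""
    hits = []
    for line in lines:
        m = LOG_LINE.match(line)
        if m and m.group("ip") == ip:
            ref = m.group("referrer")
            hits.append((
                m.group("time"),
                m.group("request"),
                ref,
                "google.com/evaluation" in ref,  # evaluator visit?
            ))
    return hits

# Made-up sample entries mimicking the session described above.
sample = [
    '1.2.3.4 - - [12/Jul/2007:13:05:00 +0000] "GET /page1 HTTP/1.1" '
    '200 512 "-" "Mozilla/5.0"',
    '1.2.3.4 - - [12/Jul/2007:13:48:00 +0000] "GET /page1 HTTP/1.1" '
    '200 512 "http://google.com/evaluation/search/rating/task-edit?task=" '
    '"Mozilla/5.0"',
]

for time, req, ref, is_eval in hits_from_ip(sample, "1.2.3.4"):
    tag = "EVALUATION REFERRER" if is_eval else "no referrer" if ref == "-" else ref
    print(time, req, tag)
```

Grouping by IP first, then eyeballing referrers and timestamps, is exactly the manual process described in the post; the script just automates the filtering.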
[edited by: Hobbs at 3:33 pm (utc) on July 12, 2007]
I also detected the same behavior in other sessions with the evaluation referrer.
Those poor evaluators are paid only $10 - $15 /hr. That doesn't buy much search quality evaluation. Google gets what it's paying for :-)
How difficult can it be to tell #*$! from Shinola?
I read once that most of the evaluators/raters are young students, but that could be incorrect of course.
$10-15 per hour for rating Web sites isn't a bad job if you're a student. It beats serving up grub in the university cafeteria or unpacking cartons in the bookstore.
Also, it isn't hard to tell #*$! from Shinola if you're intelligent and have been given guidelines. My 20-year-old son (who's a university student) would be an excellent candidate for the job.