
What qualifies a search term/website for human EWOQ review?

7:21 pm on Oct 20, 2011 (gmt 0)

Preferred Member

10+ Year Member

joined:June 24, 2005
posts: 446
votes: 0


As many of you know, the "2011 Google Quality Raters Handbook" was recently leaked to the public by accident. It contains a detailed guide for EWOQ users (reviewers paid by Google) on how to evaluate and rank/classify websites.

Curious stuff... Most interesting to me was their 'vital' rank, which I suspect has been incorrectly applied to a competitor of mine (they're one of these private institutions that pretends to be public).

Does anybody have any idea what qualifies a search term and website combo for review by EWOQ? How many do they do? Do they grab, say, the top 20 websites in the SERP for all search terms over 50k searches a month and let the EWOQ team classify those sites? Any ideas?
11:44 am on Oct 23, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:June 6, 2006
posts:1191
votes: 41


To answer the original question, Mr Smith: I would personally estimate it to be the first five websites, for terms from around 5k searches per month.


I have had several sites subjected to these reviews. It has always happened when they broke into the top three for 'good money' terms. My hypothesis is that it is the value of the search term that matters, not the number of searches.
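
To make the hypothesis concrete, here's a rough sketch in Python of the kind of trigger I'm imagining. Every number and name in it (the value cutoff, the top-5 window, the CPC weighting) is my own guess, not anything from the leaked document:

# Purely hypothetical sketch of a review-queue trigger: which
# (query, site) pairs might get handed to human raters.
def queue_for_review(serp, monthly_searches, cpc_estimate):
    """serp: ranked list of domains for one query."""
    # Hypothesis from this thread: the commercial value of the term
    # matters more than raw volume, so weight the estimated CPC in.
    query_value = monthly_searches * cpc_estimate  # guessed proxy
    if query_value < 15000:  # guessed cutoff, nothing official
        return []
    # Only the first few results seem to draw scrutiny.
    return [(rank, domain) for rank, domain in enumerate(serp[:5], start=1)]

# Example: a 5k/month term with a $3 estimated CPC clears the bar.
print(queue_for_review(
    ["brand-a.example", "newcomer.example", "brand-b.example"],
    monthly_searches=5000, cpc_estimate=3.0))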
1:29 pm on Oct 23, 2011 (gmt 0)

Junior Member

5+ Year Member

joined:Apr 11, 2011
posts: 97
votes: 0


Has Google created a new definition of spam in this set of guidelines?
6:52 pm on Oct 23, 2011 (gmt 0)

Preferred Member from TH 

10+ Year Member

joined:Mar 4, 2003
posts:422
votes: 0


In 2010, Google ran about 20,000 experiments and ended up implementing roughly 500 changes. I think evaluation by Google's quality raters is just one entry on their huge list of signals. The evaluations are also used in a machine-learning process, so that they can see the patterns in quality content.
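
To illustrate that last point: the rater judgments would work as training labels. A toy sketch of the idea (entirely my own invention; the features, numbers, and library choice are assumptions, nothing from Google):

# Toy illustration: human quality ratings used as labels to train a
# classifier that can then score pages no rater ever saw.
# Features are invented: [word_count, ad_blocks, duplicate_ratio].
from sklearn.linear_model import LogisticRegression

X = [[1200, 1, 0.05],   # long, few ads, mostly original -> rated fine
     [150, 8, 0.60],    # thin, ad-heavy, duplicated     -> rated low
     [900, 2, 0.10],
     [250, 6, 0.75]]
y = [0, 1, 0, 1]        # 1 = human rater marked it low quality

model = LogisticRegression().fit(X, y)
print(model.predict([[180, 7, 0.70]]))  # -> [1], likely low quality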
10:51 pm on Oct 23, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


Has Google created a new definition of spam in this set of guidelines?

It's pretty hard to create a "new" definition of a word whose meaning is as fuzzy as "spam". But they have definitely focused and refined what they mean by spam... and there seem to be at least two aspects: both the methods used to rank the pages and the content itself.
12:07 am on Oct 24, 2011 (gmt 0)

Preferred Member from US 

5+ Year Member

joined:June 14, 2010
posts: 602
votes: 3


I just finished reading this very interesting guide from end to end, for the second time. It appears to me that you end up being queued for a review based on something similar to the manual spam reports. At least that's what I get from it.

- Sites seem to be triggered or distributed for review based on a keyword and the ranking of the website in need of a review. (Very similar to what a person submits in a spam report.)

- Keyword stuffing, whoa?! They put the ball completely in the raters' hands by saying something to the effect of: we ask you to assign a Spam flag if you think the number of keywords on the page is excessive and would annoy or distract the real user.

- Spam ratings (as defined by the document) are separate from quality ratings. It states that pages with thin or low-quality content should be marked as such: low quality, slightly relevant, off-topic, or useless. IN ADDITION TO the Spam flag.

- Raters seem to be segmented into specialties or locales, or by some other metrics. So I get the feeling it's not just stay-at-home moms reviewing our stuff.

- Ads, specifically PPC text ads or "Sponsored Links", are presented very negatively. Yes, they state that not all PPC text ads are bad, but that's like saying "red widgets are known to cause cancer when you look at them... but not all are designed to cause cancer." Any sane person would simply conclude it's safest to assume ALL of them are.

- To be fair, it appears that it takes more than one person to list your page as spam. One person suggests that it's spam, along with a brief and pointed reason, then two or more others in your node or team or whatever they call it also review it (see the sketch below).
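
Here's the sketch: my guess at what that consensus step might look like, reduced to code. The two-confirmation threshold and all the names are just my reading, not anything stated outright in the document:

# Hypothetical consensus check: one rater proposes a Spam flag with a
# brief reason, then others on the same node/team confirm or reject it.
def spam_verdict(proposal_reason, confirmations, rejections):
    if not proposal_reason:
        return "no flag proposed"
    # My reading: at least two additional raters must agree.
    if confirmations >= 2 and confirmations > rejections:
        return "spam"
    return "flag not upheld"

print(spam_verdict("keyword stuffing", confirmations=2, rejections=0))  # spam
print(spam_verdict("keyword stuffing", confirmations=1, rejections=1))  # flag not upheld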

I wish someone (like the NY Times, or another mainstream rag) would take this entire document public. It has a "Proprietary and Confidential - Copyright 2011" statement in the footer. Surprisingly, Google itself indexes more than 4,900 other pages (link [google.com]) with that same statement. Why are those documents OK for the Google index, while its own "Proprietary and Confidential - Copyright 2011" document deserves to be excluded?

Finally, if our web pages are being assessed by what I assume are our peers, why are we not made aware of the findings, in either WMT or some other channel? Good, bad or otherwise? If nothing more, it would be nice if we had the ability to "request a rater review", so we had a better assessment of what Google expects.
2:05 am on Oct 24, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


What qualifies any website for review is, apparently, showing up on a top page for one of the higher-volume searches. Websites are evaluated in the context of a query, not just "on their own". So wouldn't any feedback need to be on a query-by-query basis?
9:05 am on Oct 24, 2011 (gmt 0)

Junior Member

5+ Year Member

joined:May 1, 2008
posts:53
votes: 0


I wish someone (like the NY Times, or another mainstream rag) would take this entire document public

This is what is needed, as at the moment it is just a few disgruntled webmasters shouting.

As for what qualifies a search term and website combo for review by EWOQ: I think it is a lot less than 50k. I have sites that performed OK, building up searches nicely, and one by one they have tumbled when they reached around 500 searches daily.
2:13 am on Nov 3, 2011 (gmt 0)

Moderator from AU 

WebmasterWorld Administrator anallawalla is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Mar 3, 2003
posts:3745
votes: 14


Many of these reviewers don't seem very technical and frequently get fired after poor performance reviews.


Here's a current job ad for those reviewers (US applicants only):

Leapforce is looking for highly educated individuals for an exciting work from home opportunity. Applicants must be self motivated and internet savvy. This is an opportunity to evaluate and improve search engine results for one of the world's largest internet search engine companies.


[leapforceathome.com]

Chance for WW members to do a better job?
2:31 am on Nov 3, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


Thanks, anallawalla. The bottom of that page has an embedded YouTube video that discloses at least some of the process Google uses to determine and evaluate changes - including the human team evaluations.

How Google makes improvements to its search algorithm [youtube.com]
4:32 am on Nov 3, 2011 (gmt 0)

Full Member

5+ Year Member

joined:May 30, 2009
posts:233
votes: 7


What qualifies any website for review is, apparently, showing up on a top page for one of the higher-volume searches. Websites are evaluated in the context of a query, not just "on their own". So wouldn't any feedback need to be on a query-by-query basis?


Or could it also be based on the total number of block requests for a site as a whole, via the site-blocking feature for logged-in users?

I wish someone (like the NY Times, or another mainstream rag) would take this entire document public


It is odd that no mention of it, as far as I can tell, has made the mainstream media by now.
7:08 am on Nov 3, 2011 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:13923
votes: 496


I'm surprised nobody has said anything about the lengthy "off-topic or useless" section. In the version I downloaded it's hard to overlook, because it's all highlighted in painful yellow. Page after page of illustrative examples. So many possible wisecracks, so little time...
2:43 pm on Nov 3, 2011 (gmt 0)

Preferred Member

5+ Year Member

joined:June 10, 2011
posts: 521
votes: 0


Amazing.
It is an interesting doc.
However, after reading it carefully, I am sure most of you think the same: it seems Google doesn't count on its software alone to locate and track poor-quality content.
Human rating must be biased.
It reminds me of when I was a student working on public surveys, and how I started cheating/misleading after doing several surveys, even without intending to.
4:02 pm on Nov 3, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Mar 17, 2006
posts: 838
votes: 0


discloses at least some of the process Google uses to determine and evaluate changes - including the human team evaluations.

"Discloses" is a pretty big word for a PR video. To me it sounded more like they were just trying to say that there's some "human touch" at Google, and that not everything is decided by a machine (algo) or a nerdy engineer.

Also, the review process (to the extent that they touch on it, at about the 1:00 mark in the video) is shown as a tool for fine-tuning the results of a query. I understood it as the reviewers having to pick which SERPs they like better, rather than reviewing the actual sites the SERP contains.

In other words, they present the human review as a tool used for something completely non-controversial: fine-tuning their own product (the SERPs). Talking about reviewing other people's sites would not, in my view, make for a great PR video, because they want to avoid the notion of their employees having a say in whether an online business lives or dies. I think from a PR standpoint they would rather say that "the machine did it".

Anyhow, as far as I understand, what the OP was after was the punitive aspect of the Google manual review. I've yet to see Google disclose that particular aspect.

In my own experience, the sites I've had penalized or banned outright acquired SERP sitelinks just a few weeks before the event. That has not necessarily translated into more traffic than usual, but when I see that a site's sitelinks in WMT are no longer empty, I start worrying. Does anyone else see that, or am I just being paranoid?
4:03 pm on Nov 3, 2011 (gmt 0)

Preferred Member

10+ Year Member

joined:Apr 25, 2006
posts:475
votes: 0


It contradicts itself so frequently that it's hard to blame the raters who at best can only keep a summary of the document in their heads. A lot of:

"If you see X, the site is evil."

and then a page later:

"Not all sites with X are evil"

Which one do you think the raters will remember?
8:16 pm on Nov 18, 2011 (gmt 0)

Full Member

5+ Year Member

joined:Sept 29, 2009
posts:257
votes: 0


OK, a word of caution about this ('cause I'm not sure if it is really him), but someone calling themselves Matt Cutts posted on PotpieGirl's blog saying the following:

[potpiegirl.com...]

Normally we do not comment on ranking methods but I'll explain a misconception: input from manual raters is used only in the rarest of cases when a non-brand cracks the top ten for high value money terms.


She hasn't yet responded - if it is him, I'm sure she'll try to let us know.
11:33 pm on Nov 18, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


That comment sure has Matt's tone - and we know her blog is on the Google radar so I don't doubt it.
11:52 pm on Nov 18, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Mar 17, 2006
posts: 838
votes: 0


the rarest of cases when a non-brand cracks the top ten for high value money terms

So, I guess that's where we're headed: no non-brands in the first ten for high-value money terms? Interesting; not even a word about relevancy for the term. Well, at least he didn't mince his words this time.
12:13 am on Nov 19, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


I never had any doubt - if an unfamiliar site starts ranking for a big money term, it will get a manual check. That doesn't mean it can't ever rank, but if the manual raters question the ranking as well, then its time on top will be limited.

Heck, if a domain's total SERP impressions show a significant spike, then I'm pretty sure it's going to get a personal look, too - even if no really big money terms are involved. If I ran a search engine, I know I'd demand that kind of vigilance.
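
Detecting that kind of spike would be trivial to automate, which is why I'd expect it. Purely as an illustration (the window size and multiplier are my own guesses, not anything Google has said):

# Toy spike detector on daily SERP impressions: flag a domain when
# today's count far exceeds its recent average. Numbers are invented.
def impressions_spiked(history, today, window=28, factor=3.0):
    recent = history[-window:]
    baseline = sum(recent) / len(recent)
    return today > factor * baseline

history = [1000] * 28                           # a steady month
print(impressions_spiked(history, today=4500))  # True -> worth a human look
print(impressions_spiked(history, today=1400))  # False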
1:35 pm on Nov 19, 2011 (gmt 0)

Full Member

5+ Year Member

joined:Sept 29, 2009
posts:257
votes: 0


Turns out it wasn't him; see this conversation: [twitter.com...]
4:03 pm on Nov 19, 2011 (gmt 0)

Moderator This Forum from US 

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 11, 2000
posts:11883
votes: 293


To clarify the above... Matt Cutts did not make the comment attributed to him above. Another Matt... Matt McGee... caught the false attribution.

(Since hash tags break in WebmasterWorld link redirects, and some of the links in the above aren't permalinks, here's an abridged reconstruction with links or urls that will work....)

http://www.potpiegirl.com/2011/11/google-raters-who-are-they/comment-page-1/#comment-13467
Matt McGee November 18, 2011 at 9:12 pm
It would be nice to get some confirmation about whether this is/was really Matt leaving a comment. The underscore in his name and the lack of an avatar makes me think it's not. But I'll welcome being wrong.

http://twitter.com/mattcutts/status/137793239087460352 [twitter.com]
Matt Cutts
@mattmcgee deny--wasn't me. Thanks for spotting.