
Google SEO News and Discussion Forum

What qualifies a search term/website for human EWOQ review?
smithaa02
msg:4377268 - 7:21 pm on Oct 20, 2011 (gmt 0)

As many of you know, the "2011 Google Quality Raters Handbook" was recently leaked to the public by accident. Inside is a detailed guide for EWOQ users (reviewers paid by Google) on how to evaluate and rank/classify websites.

Curious stuff... Most interesting to me was their 'vital' rating, which I suspect has been incorrectly applied to a competitor of mine (they're one of these private institutions that pretends to be public).

Does anybody have any idea what qualifies a search term and website combo for review by EWOQ? How many do they do? Do they grab, say, the top 20 websites in the SERP for all search terms with over 50k searches a month and let the EWOQ team classify those sites? Any ideas?

 

superclown2
msg:4378284 - 11:44 am on Oct 23, 2011 (gmt 0)

To answer the original question, Mr Smith: I personally would estimate it to be the first 5 websites for terms from around 5k searches per month.

I have had several sites subjected to these reviews. It has always happened when they broke into the top three for 'good money' terms. My hypothesis is that it is the value of the search term that matters, not the number of searches.
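
To make the two guesses in this thread concrete - the volume cutoff from the original post and the term-value idea above - here is a toy sketch. Every threshold is pure speculation, not anything Google has published:

def qualifies_for_review(rank, monthly_searches, est_cpc_usd):
    """Guess whether a (query, site) pair might be queued for human review."""
    # Hypothesis A (original post): top 20 results for terms with
    # 50k+ searches a month.
    volume_trigger = rank <= 20 and monthly_searches >= 50_000
    # Hypothesis B (this post): top 3-5 results for "good money" terms,
    # where the commercial value of the term matters more than raw volume.
    value_trigger = rank <= 5 and est_cpc_usd >= 5.00
    return volume_trigger or value_trigger

# A site ranking #4 for a $12-CPC term with only 5k monthly searches would
# qualify under hypothesis B but not hypothesis A.
print(qualifies_for_review(rank=4, monthly_searches=5_000, est_cpc_usd=12.00))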

courier
msg:4378308 - 1:29 pm on Oct 23, 2011 (gmt 0)

Has Google created a new definition of Spam in this set of guidelines?

Nuttakorn
msg:4378367 - 6:52 pm on Oct 23, 2011 (gmt 0)

In 2010 Google ran 20,000 experiments, which resulted in about 500 implemented changes. I think evaluation from Google's quality raters is just one item on their huge list of signals. These evaluations are also used in a machine-learning process, so Google can learn the patterns of quality content.
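
If that's right, the raters' judgments would serve as training labels for a quality model. A minimal sketch of the idea, with invented features and data (Google's real pipeline is not public):

# Human rater judgments as labels for a model that learns what quality
# content looks like. Features, data, and model choice are all invented
# for illustration.
from sklearn.linear_model import LogisticRegression

# Each row: [ad_density, keyword_repetition, original_content_ratio]
pages = [
    [0.6, 0.9, 0.1],  # ad-heavy, keyword-stuffed, mostly copied
    [0.1, 0.2, 0.9],  # clean page, mostly original content
    [0.5, 0.7, 0.3],
    [0.2, 0.1, 0.8],
]
rater_labels = [0, 1, 0, 1]  # 0 = rated low quality/spam, 1 = rated useful

model = LogisticRegression().fit(pages, rater_labels)

# Once trained, the model can score pages no human rater has ever seen.
print(model.predict([[0.4, 0.8, 0.2]]))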

tedster
msg:4378444 - 10:51 pm on Oct 23, 2011 (gmt 0)

Has Google created a new definition of Spam in this set of guidelines?

It's pretty hard to create a "new" definition of a word whose meaning is as fuzzy as "spam". But they have definitely focused and refined what they mean by spam... and there seem to be at least two aspects: both the methods used to rank the pages and the content itself.

mhansen
msg:4378466 - 12:07 am on Oct 24, 2011 (gmt 0)

I just finished reading a very interesting guide from end to end, for the second time. It appears to me that you end up being queued for a review based on something similar to the manual spam reports. At least that's what I get from it.

- Sites seem to be triggered for review, or distributed to raters, based on a keyword and the ranking of the website in question. (Very similar to what a person submits in a spam report.)

- Keyword stuffing, whoa?! They put the ball completely into the raters' hands by saying something to the effect of: we ask you to assign a Spam flag if you think the number of keywords on the page is excessive and would annoy or distract the real user.

- Spam ratings (as defined by the document) are separate from quality ratings. It states that pages with thin or low-quality content should be marked as such - low quality, slightly relevant, off-topic or useless - IN ADDITION TO the Spam flag.

- Raters seem to be segmented into specialties or locales, or by some other metrics. So I get the feeling it's not just stay-at-home moms reviewing our stuff.

- Ads, specifically PPC text ads or "Sponsored Links", are presented very negatively. Yes, they state that not all PPC text ads are bad, but that's like saying "Red Widgets are known to cause cancer when you look at them... but not all are designed to cause cancer." Any sane person would simply figure it's safest to assume ALL are.

- To be fair, it appears that it takes more than 1 person to list your page as spam. 1 person suggests that it's spam, along with a brief and pointed reason, then 2 or more others in your node or team or whatever they call it also review it.
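
That multi-rater confirmation could look something like this - the majority rule and the minimum of three raters are my assumptions, not something the document spells out:

from collections import Counter

def consensus_spam_flag(votes, min_raters=3):
    """votes: list of (rater_id, is_spam) pairs from independent raters."""
    if len(votes) < min_raters:
        return None  # not enough reviews yet to decide
    tally = Counter(is_spam for _, is_spam in votes)
    return tally[True] > tally[False]

# One rater proposes the flag, two others review: 2 of 3 agree, flag sticks.
votes = [("rater_a", True), ("rater_b", True), ("rater_c", False)]
print(consensus_spam_flag(votes))  # True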

I wish someone (like the NY Times, or another mainstream rag) would take this entire document public. It has a "Proprietary and Confidential - Copyright 2011" statement in the footer. Surprisingly, Google itself indexes more than 4900 other pages (link [google.com]) with that same statement. Why are those documents OK for the Google index, while its own "Proprietary and Confidential - Copyright 2011" document deserves to be excluded?

Finally, if our web pages are being assessed by what I assume are our peers, why are we not made aware of the findings - good, bad or otherwise - in WMT or by some other method? If nothing more, it would be nice if we had the ability to "request a rater review", so we had a better assessment of what Google expects.

tedster
msg:4378491 - 2:05 am on Oct 24, 2011 (gmt 0)

What qualifies any website for review is, apparently, showing up on a top page for one of the higher-volume searches. Websites are evaluated in the context of a query, not just "on their own". So wouldn't any feedback need to be on a query-by-query basis?
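
One way to picture that: if ratings are keyed by (query, site) pairs rather than by site alone, any feedback channel would have to carry the query too. A tiny sketch with invented data:

# The same page can be rated differently for different queries, so a
# per-site summary would lose the context the rating was given in.
ratings = {
    ("blue widgets", "http://example.com/widgets"): "useful",
    ("widget repair", "http://example.com/widgets"): "off-topic",
}

for (query, url), rating in ratings.items():
    print(f"{url} rated '{rating}' for query '{query}'")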

affiliation
msg:4378575 - 9:05 am on Oct 24, 2011 (gmt 0)

I wish someone (Like the NY Times, or another mainstream rag) would take this entire document public

This is what is needed; at the moment it is just a few disgruntled webmasters shouting.

As for what qualifies a search term and website combo for review by EWOQ: I think it is a lot less than 50k. I have sites that performed OK, building searches up nicely, and one by one they have tumbled when they reached around 500 searches daily.

anallawalla
msg:4382908 - 2:13 am on Nov 3, 2011 (gmt 0)

Many of these reviewers don't seem very technical and frequently get fired after poor performance reviews.


Here's a current job ad for those reviewers (US applicants only):

Leapforce is looking for highly educated individuals for an exciting work from home opportunity. Applicants must be self motivated and internet savvy. This is an opportunity to evaluate and improve search engine results for one of the world's largest internet search engine companies.


[leapforceathome.com]

Chance for WW members to do a better job?

tedster
msg:4382916 - 2:31 am on Nov 3, 2011 (gmt 0)

Thanks, anallawalla. The bottom of that page has an embedded YouTube video that discloses at least some of the process Google uses to determine and evaluate changes - including the human team evaluations.

How Google makes improvements to its search algorithm [youtube.com]

micklearn
msg:4382944 - 4:32 am on Nov 3, 2011 (gmt 0)

What qualifies any website for review is, apparently, showing up on a top page for one of the higher-volume searches. Websites are evaluated in the context of a query, not just "on their own". So wouldn't any feedback need to be on a query-by-query basis?


Or could it also be based on the total number of block requests (from the "block this site" feature for logged-in users) against a site as a whole?

I wish someone (Like the NY Times, or another mainstream rag) would take this entire document public


It is odd that, as far as I can tell, no mention of it has made it into the mainstream media by now.

lucy24
msg:4382961 - 7:08 am on Nov 3, 2011 (gmt 0)

I'm surprised nobody has said anything about the lengthy "off-topic or useless" section. In the version I downloaded it's hard to overlook, because it's all highlighted in painful yellow. Page after page of illustrative examples. So many possible wisecracks, so little time...

Zivush
msg:4383051 - 2:43 pm on Nov 3, 2011 (gmt 0)

Amazing - it is an interesting doc.
However, after reading it carefully, I am sure most of you get the impression that Google doesn't count on its software alone to locate and track poor-quality content.
Human rating must be biased.
It reminds me of when I was a student working on public surveys, and how I started cheating/misleading after several surveys - even without intending to.

1script
msg:4383080 - 4:02 pm on Nov 3, 2011 (gmt 0)

discloses at least some of the process Google uses to determine and evaluate changes - including the human team evaluations.

"Discloses" is a pretty big word for a PR video. To me it sounded more like they were just trying to say that there's some "human touch" in Google, and that not everything is decided by a machine (algo) or a nerdy engineer.

Also, the review process (to the extent that they touch upon it, at about the 1:00 mark in the video) is shown as a tool for fine-tuning the results of a query. I understood it as the reviewers having to pick the SERPs they like better, rather than review the actual sites those SERPs contain.

In other words, they present the human review as a tool used for something completely non-controversial: fine-tuning of their own product (the SERPs). Talking about reviewing other people's sites would not, in my view, make for a great PR video, because they would want to avoid the notion of their employees having a say in whether an online business lives or dies. I think from a PR standpoint they would rather say that "the machine did it".

Anyhow, as far as I understand it, the OP's interest was in the punitive aspect of the Google manual review. I've yet to see Google disclose that particular aspect.

In my own experience, the sites I've had penalized or banned outright acquired SERP sitelinks just a few weeks before the event. That has not necessarily translated into more traffic than usual, but when I see that a site's sitelinks in WMT are no longer empty, I start worrying. Does anyone else see that, or am I just being paranoid?

Content_ed
msg:4383081 - 4:03 pm on Nov 3, 2011 (gmt 0)

It contradicts itself so frequently that it's hard to blame the raters, who at best can only keep a summary of the document in their heads. A lot of:

"If you see X, the site is evil."

and then a page later:

"Not all sites with X are evil"

Which one do you think the raters will remember?

AlyssaS
msg:4388663 - 8:16 pm on Nov 18, 2011 (gmt 0)

OK, a word of caution about this ('cause I'm not sure if it is really him), but someone calling themselves Matt Cutts posted on PotpieGirl's blog, saying the following:

[potpiegirl.com...]

Normally we do not comment on ranking methods but I'll explain a misconception: input from manual raters is used only in the rarest of cases when a non-brand cracks the top ten for high value money terms.


She hasn't yet responded - if it is him, I'm sure she'll try to let us know.

tedster
msg:4388718 - 11:33 pm on Nov 18, 2011 (gmt 0)

That comment sure has Matt's tone - and we know her blog is on the Google radar so I don't doubt it.

1script
msg:4388722 - 11:52 pm on Nov 18, 2011 (gmt 0)

the rarest of cases when a non-brand cracks the top ten for high value money terms

So I guess that's where we are headed: no non-brands for high-value money terms in the first 10? Interesting that there's not even a word about relevancy for the term. Well, at least he didn't mince his words this time.

tedster
msg:4388726 - 12:13 am on Nov 19, 2011 (gmt 0)

I never had any doubt - if an unfamiliar site starts ranking for a big money term, it will get a manual check. That doesn't mean it can't ever rank, but if the manual raters question the ranking as well, then its time on top will be limited.

Heck, if a domain's total SERP impressions show a significant spike, then I'm pretty sure it's going to get a personal look too - even if no really big money terms are involved. If I ran a search engine, I know I'd demand that kind of vigilance.
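
A crude way to express that vigilance - the window sizes and the 3x multiplier here are invented for illustration, not anything Google has confirmed:

def impressions_spiked(daily_impressions, baseline_days=28, recent_days=7, factor=3.0):
    """daily_impressions: oldest-to-newest list of daily impression counts."""
    if len(daily_impressions) < baseline_days + recent_days:
        return False  # not enough history to judge
    baseline = daily_impressions[-(baseline_days + recent_days):-recent_days]
    recent = daily_impressions[-recent_days:]
    baseline_avg = sum(baseline) / len(baseline)
    recent_avg = sum(recent) / len(recent)
    return baseline_avg > 0 and recent_avg >= factor * baseline_avg

history = [1_000] * 28 + [4_000] * 7  # steady month, then a sudden jump
print(impressions_spiked(history))    # True: triggers the hypothetical check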

AlyssaS
msg:4388862 - 1:35 pm on Nov 19, 2011 (gmt 0)

Turns out it wasn't him - see this conversation: [twitter.com...]

Robert Charlton
msg:4388883 - 4:03 pm on Nov 19, 2011 (gmt 0)

To clarify the above... Matt Cutts did not make the comment attributed to him above. Another Matt... Matt McGee... caught the false attribution.

(Since hash tags break in WebmasterWorld link redirects, and some of the links in the above aren't permalinks, here's an abridged reconstruction with links or urls that will work....)

http://www.potpiegirl.com/2011/11/google-raters-who-are-they/comment-page-1/#comment-13467
Matt McGee November 18, 2011 at 9:12 pm
It would be nice to get some confirmation about whether this is/was really Matt leaving a comment. The underscore in his name and the lack of an avatar make me think it's not. But I'll welcome being wrong.

http://twitter.com/mattcutts/status/137793239087460352 [twitter.com]
Matt Cutts
@mattmcgee deny--wasn't me. Thanks for spotting.
