| 9:53 am on Jun 6, 2005 (gmt 0)|
|that made it sound like people were reaching in via this console to tweak results directly, which just isn't true at all. |
If I understand GG correctly - what he's saying here is that the ratings aren't used directly to drop sites from the index, just to provide data on human evaluations with which to tune the algo.
That makes perfect sense - and I genuinely don't doubt GG for a moment.
I'm a bit confused about the term "whitelist" in the document though. I guess this could simply mean - "here's an example of a site which humans like" - and the site then takes its chances in the algo tweaking like any other.
But the term "whitelist" SEEMS to imply "hands off this (human evaluated) site". ie manually exempting it from any "anti spam" penalties in the algo tweak.
Of course since this document was never intended for public release, it's likely nobody ever went through it checking that none of the words implied anything incorrect(!).
Still - given that I understand from GGs statement above that sites are not being penalised manually in this process, it would be interesting to know if "whitelisted" in this context is a hint to the algo tuners, or an exemption from spam penalty.
| 9:57 am on Jun 6, 2005 (gmt 0)|
|Please reread my entry. Debbie Frost is not a student, but Google's official spokeswoman. |
I stand corrected! :)
|... it would be interesting to know if "whitelisted" in this context is a hint to the algo tuners, or an exemption from spam penalty. |
If you read the paper on Combating Web Spam with Trust Rank (link provided in post number 26) you may get a better understanding. It's worth the read!
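For anyone who doesn't want to wade through the paper: the core TrustRank idea is to seed trust on a small set of human-reviewed pages and propagate it along outlinks with a damping factor, so pages far from the trusted seed set accumulate little or no trust. Here's a minimal, hypothetical sketch of that propagation - the graph, page names, and parameter values are invented for illustration, not taken from the paper or from Google:

```python
# Toy sketch of trust propagation in the spirit of "Combating Web
# Spam with TrustRank". Trust mass starts on a few human-reviewed
# seed pages and flows along outlinks with damping factor alpha.

def trust_rank(links, seeds, alpha=0.85, iterations=50):
    """links: {page: [pages it links to]}; seeds: trusted pages."""
    pages = set(links) | {p for outs in links.values() for p in outs}
    # Static distribution: all trust mass starts on the seed pages.
    d = {p: (1.0 / len(seeds) if p in seeds else 0.0) for p in pages}
    t = dict(d)
    for _ in range(iterations):
        new = {p: (1 - alpha) * d[p] for p in pages}
        for page, outs in links.items():
            if outs:
                share = alpha * t[page] / len(outs)
                for target in outs:
                    new[target] += share
        t = new
    return t

# Invented example graph: a trusted hub linking to good sites,
# plus an isolated spam cluster that only links within itself.
links = {
    "trusted-hub": ["good-site", "another-good-site"],
    "good-site": ["another-good-site"],
    "another-good-site": [],
    "spam-site": ["spam-site-2"],
    "spam-site-2": ["spam-site"],
}
scores = trust_rank(links, seeds=["trusted-hub"])
```

In this toy graph the spam cluster ends up with zero trust because no trusted page links into it - which is exactly the intuition behind using a whitelist as a seed set for algo tuning rather than as a direct ranking override.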
| 10:35 am on Jun 6, 2005 (gmt 0)|
If you check my postings, you won't find any value judgement. Why? I'm not sure yet if the Hub is good or bad.
* If you decide to evaluate one single URL, you should evaluate all URLs. There's no way back once you've started.
* I'm worried about whitelists. Google should check the sites over and over again to be sure that the whitelisted sites still meet the criteria. But they don't check them over and over again...
* The international agents, a.k.a. students, have proved their value. Rater Hub Google has been active for years - years in which Google grew to great heights.
* Evaluation of search results is quite normal, but Google takes it too great heights. The company shows a deep concern for the best search results.
| 10:40 am on Jun 6, 2005 (gmt 0)|
There is very good proof that Google does hand edit its results. I have seen it happen. It happens all the time.
| 10:55 am on Jun 6, 2005 (gmt 0)|
Can you give me a few examples of hand edited sites (by Google)?
| 11:08 am on Jun 6, 2005 (gmt 0)|
This is yet another one of these non-news threads that crops up regularly here.
I just don't understand the shock and amazement that follows the revelation that Google uses human quality control.
To suggest that a bunch of worldwide students in the employ of a multi-billion dollar corporation somehow reaches in and manipulates the serps is just too ludicrous for words.
GG shouldn't need to step in for clarification if everyone just used their common sense.
| 11:19 am on Jun 6, 2005 (gmt 0)|
|Can you give me a few examples of hand edited sites (by Google)? |
If you search this thread harder you will most likely find one.
| 11:21 am on Jun 6, 2005 (gmt 0)|
I'd disagree - this is big news as far as I'm concerned.
No, these students are not actually editing the SERPs, as GG says - but their evaluation DOES have a direct effect on the SERPs. The fact that it is not them actually pushing the buttons is irrelevant in my opinion.
added: GG answered a question I posted about spam removal in the "questions for GG" thread - basically that it was being hit a lot faster these days - I think we now know why!
[edited by: UK_Web_Guy at 11:26 am (utc) on June 6, 2005]
| 11:24 am on Jun 6, 2005 (gmt 0)|
That's Voelspriet to you:) Found it, thanks (Blush)
| 11:33 am on Jun 6, 2005 (gmt 0)|
I really don't see what all the fuss is about. Google never made a secret of its human evaluation of search results, to my knowledge. That's not the same as giving one individual the right to mess around with the SERPs.
From the documents it's quite clear that the raters are given real past query results to evaluate and that those results are rated by groups of people. If the raters don't agree, the results are evaluated again and discussed. It's not quite clear how exactly Google uses those ratings, but hey, does that matter much? As long as they are used to improve search results... Any other search engine will have this kind of quality control; only the implementation may be different.
I must say I'm quite impressed by the way the documents instruct the raters on how to distinguish between useful and non-useful pages and how to tell spam from non-spam, e.g. sneaky versus non-sneaky redirects. (The spam detection part may have been updated and expanded by now with other techniques, but the intention is probably still the same.)
The guidelines are very well in line with the SEO mantra: provide unique content. So I think I like those guidelines :)
| 11:34 am on Jun 6, 2005 (gmt 0)|
|If you decide to evaluate one single URL, you should evaluate all URLs. |
Under the heading of "Bad" you made the above statement. That in and of itself is a value judgement. Perhaps it's not included in your postings on your site, but it isn't difficult to guess which side of the fence you sit on in regards to the rating system.
You further state:
|I'm worried about whitelists. Google should check the sites over and over again to be sure that the whitelisted sites still meet the criteria. But they don't check them over and over again... |
How on earth would you know they don't check whitelisted sites over and over again? Are you a Google staff member involved with this programme? Do you know someone who is and who is sharing this information with you?
Under the heading "Good" you said,
|Evaluation of search results is quite normal, but Google takes it too great heights. |
Does that mean they take it too far in your opinion?
You know, nobody seems to be paying much attention to "Trust Rank" in the grand scheme of Google's algorithms, filters, etc.
How would anyone know if Google hasn't already created several of their own seed sites (created by Google employees using Google's guidelines) by which to gauge all whitelisted or "trusted" sites?
If I were Google, that's exactly what I would do! I'd build several sites in several competitive markets and donate them to various charities to operate. However, any and all changes would have to be made by Google employees to ensure their continued adherence to Google's guidelines. It's a surefire way to know what standards are considered 100% on the up and up.
Think about it. Google builds an acceptable affiliate site, marketing "whatever" and using what they consider to be acceptable SEO techniques. They then approach some charity to monitor and process orders, sell and ship the goods and collect the profits. But Google controls the site at all times.
Now multiply that effort by a set of several sites in various niche markets. To me, this would guarantee an excellent base by which to evaluate other sites ... while "doing no harm".
Obviously, I am not saying Google has done this or has even considered doing anything like it ... but there is nothing preventing them from doing so and, to be honest, I would if I were in their shoes.
| 11:41 am on Jun 6, 2005 (gmt 0)|
>Can you give me a few examples of hand edited sites (by Google)?
I think I've seen some of those on several Dutch websites last week...
|I'd disagree - this is big news as far as I'm concerned. |
I don't see how this can be big news or why it's in the news now as it's been around for ages. Last UPDATE of one of those documents is 31 December 2003. That's 18 months ago.
| 11:57 am on Jun 6, 2005 (gmt 0)|
Those documents etc might have been around for 18 months, but did we all have access to them 18 months ago?
That's why this is in the news now - because such an insight into how this thing works has never been made available before.
| 11:58 am on Jun 6, 2005 (gmt 0)|
>I really like the human-reviewer angle. Google could hire a lot of cheap brains by using college kids.
Do you want to trust your livelihood to a college kid who might have a bias, or be in a bad mood? What if he/she clicks on the wrong checkbox?
As far as I understand this (after GG explained it):
this is not to penalize site A or B. It is to test whether the SERPs are better or worse after updates. The .doc file is to guide students during that process. Example: if you search for "widget" and a thin affiliate or a spam site comes up first, the search results aren't that good.
Am I missing something?
On edit: I think Google is not happy because they have gone out of their way to say that results are automated, that they fight spam via the algo, etc. I was looking at the sites mentioned there with hidden text (one still has it); they're still in the index. I wonder if "bad" sites are just buried in the SERPs instead of banned. That could explain a lot of MIA sites.
[edited by: walkman at 12:17 pm (utc) on June 6, 2005]
| 12:02 pm on Jun 6, 2005 (gmt 0)|
|Do you want to trust your livelihood to a college kid who might have a bias, or be in a bad mood? What if he/she clicks on the wrong checkbox? |
I have a feeling that major anomalies not in agreement with other raters would easily stand out.
| 12:11 pm on Jun 6, 2005 (gmt 0)|
I really wish to thank Henk van Ess for bringing the web community such good news; Google is doing something to improve the quality of the serps!
That is exactly what the majority of fellow members have been asking for. Choose any relevant thread of these forums, and you shall read post after post of members complaining why Google doesn't do enough about spam sites and scrapers which occupy the top of serps in their niches.
Those spam sites are the real ENEMY #1 for any decent whitehat and AdSense publisher. And thank you, Google and Google's folk, for doing something about it!
| 12:23 pm on Jun 6, 2005 (gmt 0)|
> I have a feeling that major anomalies not in agreement with other raters would easily stand out.
I agree. I was talking about a one-time thing: one student from (insert country here) being judge, jury, and executioner.
Plenty of people complain of DMOZ mods nuking their competitor's sites. With Google it would be worse. If they choose to just bury your site instead of banning it, there's no way to tell what happened, or who to appeal to.
| 12:28 pm on Jun 6, 2005 (gmt 0)|
Interesting site, voelspriet -- I found some of your comments on <snip> interesting. :)
[edited by: lawman at 2:01 pm (utc) on June 6, 2005]
[edit reason] No Blog Links Please [/edit]
| 12:47 pm on Jun 6, 2005 (gmt 0)|
|I agree. I was talking about a one-time thing: one student from (insert country here) being judge, jury, and executioner. |
My take on it is that there are several raters all rating the same pages. I don't think what you are describing could happen, or is likely to.
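Nothing public says how Google actually aggregates its raters' scores, so purely as an illustration of why a lone rogue rating would stand out: with several raters scoring the same page, any score far from the group median is trivial to flag automatically. All rater names, scores, and thresholds below are made up:

```python
# Hypothetical illustration only: a single biased or careless rating
# for a page is easy to spot by comparing each rater's score against
# the median of all ratings given to that same page.
from statistics import median

def flag_outliers(ratings, max_deviation=2):
    """ratings: {rater: score} for one page, e.g. on a 1-10 scale.
    Returns raters whose score strays too far from the group median."""
    m = median(ratings.values())
    return sorted(r for r, s in ratings.items()
                  if abs(s - m) > max_deviation)

# Invented example: three raters roughly agree, one does not.
page_ratings = {"rater_a": 8, "rater_b": 7, "rater_c": 8, "rater_d": 2}
outliers = flag_outliers(page_ratings)  # rater_d's score of 2 is
                                        # far from the median of 7.5
```

With any scheme of this shape, one student in a bad mood could not quietly bury a site; their rating would simply be discarded or sent back for discussion.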
| 12:59 pm on Jun 6, 2005 (gmt 0)|
|I found some of your comments on this post especially interesting. :) |
Whoisgregg, that's Dutch satire [google.com]
Won't try again to be funny. Not my style.
| 1:09 pm on Jun 6, 2005 (gmt 0)|
|Let's go back to the content. Check the discussions on Search Engine Watch or many other professional SE-boards. Google Guy, do you really think it's irrelevant to talk about Google's Human Quality Evaluation? |
It's an interesting topic, but quite frankly I would have been surprised and disappointed if Google didn't have several layers of quality control in place.
|Let's go along with your reasoning. If you say agents don't have any influence on the index, I have a question for you. Why pay them for something if it has no effect on the index? Must be charity then. |
You're playing that "reporter" semantics game here. GoogleGuy mentioned this:
|reaching in via this console to tweak results directly, which just isn't true at all. |
Even I can influence results when I send in a spam report and Google doesn't even pay me. Just because I send in a report doesn't mean Google has to agree or change anything. I didn't see any information in your "story" about what Google does with this information. I can think of lots of ways they could use the information without tweaking the results directly.
You've created a "buzz," enjoy it while it lasts. My prediction - 14 minutes of fame.
| 1:10 pm on Jun 6, 2005 (gmt 0)|
|Now multiply that effort by a set of several sites in various niche markets. To me, this would guarantee an excellent base by which to evaluate other sites ... while "doing no harm". |
Yes, but IMO the problem is that the sites they say are "good sites" sometimes aren't any different than the sites they say are bad.
Take redirects. Does anyone honestly see a difference between the "sneaky redirects" and the "legitimate" ones? What kind of signal does that give to webmasters?
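Purely to make the distinction concrete (this is my own crude heuristic, not anything from the rater documents): a server-side redirect within the same site is usually legitimate, while a zero-second meta refresh that whisks the visitor off to a different domain is the classic "sneaky" pattern, since the visitor never sees the page the engine indexed. A toy detector might look like this:

```python
# Illustration only: a crude, hypothetical heuristic for the kind of
# redirect distinction the rater guidelines reportedly ask about.
# Real detection is far more involved (JavaScript, cloaking, etc.).
import re
from urllib.parse import urlparse

META_REFRESH = re.compile(
    r'<meta[^>]+http-equiv=["\']?refresh["\']?[^>]*'
    r'content=["\']?\s*(\d+)\s*;\s*url=([^"\'>]+)', re.I)

def looks_sneaky(page_url, html):
    """Flag a zero-delay meta refresh that jumps to a different host."""
    m = META_REFRESH.search(html)
    if not m:
        return False
    delay, target = int(m.group(1)), m.group(2)
    same_host = urlparse(page_url).netloc == urlparse(target).netloc
    # Instant refresh to another host: the visitor never sees this page.
    return delay == 0 and not same_host
```

By this (invented) rule, a five-second refresh pointing back into the same domain passes, while an instant hop to an unrelated domain does not - which is roughly the judgement call the raters are reportedly asked to make by hand.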
| 1:26 pm on Jun 6, 2005 (gmt 0)|
>but did we all have access to them 18 months ago
There you have a point :)
Perhaps I should rephrase my surprise to "I don't understand why everybody is suddenly screaming Human Intervention". I also don't see how people can attribute recent rank losses to this piece of information.
| 1:27 pm on Jun 6, 2005 (gmt 0)|
I think it's all silly, really. If Google gave that kind of power to one person or a group of students, it would be very hard to hide the potential for abuse once they were named - the power to reach in and pull a website on ethics, or better yet, a competitor sending them a paycheck.
If such a system exists, the next Deep Throat will unveil it - an evil system that even black hats wouldn't touch. I mean, it's an honour-among-thieves issue. Whether the thieves are Google, the students or a text link broker site, the owners should all be displayed in public and paraded around on a stick. BTW, how's GOOG doing on Nasdaq... checks... hmmm
| 1:32 pm on Jun 6, 2005 (gmt 0)|
|You've created a "buzz," enjoy it while it lasts. My prediction - 14 minutes of fame |
I'm from Europe. I don't long for fame. I'm a busy reporter with a hobby (search engines). You can sue me for that.
Let me explain how and why I blogged some details about 'Rater Hub Google'. A student showed it to me months ago. Only last month another international agent was prepared to send me some material. Most information was from 2004 but considered still valid (according to three other agents that contacted me <snip>).
The published insights are not that spectacular. But insight in Google's evaluation of websources is rare. I wanted to forward the details to the web community to get some discussion. Why?
People should know how a search engine works. Basically, it's a stupid thing. Intellect has to come from the user. If he/she doesn't ask a smart question, he/she gets a stupid answer. If you really want adequate answers, there's no way of escaping advanced syntax. Most people don't use inurl:, allintitle:, site:, numrange or whatever. They don't filter Google's answers with advanced searching.
For most users, Google's algorithm decides the order of the answers - helped by some input from the Rater Hub. I think there is a purpose in publishing how the raters think, how they refine queries, and what is considered spam and what is not (whitelist). You can evaluate your answers better if you know, for example, that Kelkoo is on Google's whitelist and is considered an important site.
[edited by: lawman at 2:05 pm (utc) on June 6, 2005]
[edit reason] No Url Drops Please [/edit]
| 2:22 pm on Jun 6, 2005 (gmt 0)|
> You've created a "buzz," enjoy it while it lasts. My prediction - 14 minutes of fame.
What's up with the anger towards this guy? He found something interesting and reported it. I can see why Google is mad at him, but face it, it's a great find/story and most tech reporters would've loved to have reported it first. We simply don't know the role that these raters truly play in the SERPs. No offense to anyone, but Google will try to minimize their role; it's in their self-interest - no matter what the truth is. If I worked for Google I would do the same, no questions asked. On the other hand, maybe they truly play a minimal role; we just have to keep an open mind on this.
14 minutes of fame, it's about 14 more than most of us anyway. ;)
| 2:40 pm on Jun 6, 2005 (gmt 0)|
|People should know how a search engine works. |
I understand what you are saying. By understanding how the engine works, they can get better results. But most users want the engine to do that thinking for them - as lazy as it sounds.
|14 minutes of fame, it's about 14 more than most of us anyway. ;) |
Agree, 14 minutes more than me...
Don't get me wrong, voelspriet is merely taking advantage of the interest in Google. Good for him. I'm only predicting that the interest is short lived because the buzz is around the webmaster community - which is easily distracted.
I consider myself a fairly objective person. I do take exception to voelspriet's post twisting GoogleGuy's words around. (And I don't believe in kissing up to GG, I just call 'em as I see 'em.)
| 2:47 pm on Jun 6, 2005 (gmt 0)|
|what's up with the anger towards this guy? |
I think you're misinterpreting as anger, what some people consider to be "old news". Some of us just don't see what the big deal is all about!
Sure, Mr. Ess found something which most of us aren't privy to ... but then again, we aren't supposed to be privy to it. According to Google Guy, the page is protected and marked "Google Proprietary and Confidential". Further, the student who revealed this info has seemingly (according to Google Guy) broken a non-disclosure agreement he or she had with Google when they signed on.
I'm not angry at Mr. Hess and I doubt anyone (other than those who work for or hold shares in Google) is angry either. I just think it's much ado about nothing new!
Exactly what has Mr. Ess revealed that many of us didn't know about in the first place? Without getting into the specifics of the form (which we were never supposed to see in the first place) ... he has not enlightened us any further about anything.
In post number 26, you will find a link to Google's "Quality Raters" job opportunities listing and also a link to the Trust Rank article which was written on March 1, 2004 and has been the subject of many discussions here on WebmasterWorld and other forums over the past few months.
I see nothing new in his blog representing value to anyone ... except perhaps a handful of lawyers! ;)
[edited by: Liane at 2:53 pm (utc) on June 6, 2005]
| 2:51 pm on Jun 6, 2005 (gmt 0)|
In reply to LinkJack's comment:
"Develop content, add value to affiliates" is not SEO, argue at will.
I disagree. Search engine optimization should not be defined by a set of tactics.
“SEO” is the practice of enhancing a site's listings in the SERPS over the long-term.
In order to do this most effectively, a search engine optimizer must help his/her clients appeal to the algorithms employed by the likes of Google. Since Google designs its algorithms to prioritize sites that add the most value to users, the best SEO tactics focus on just that: adding value.
It's the SEO's job to do everything it takes to develop a site that adds value to users, and makes it easy for search engines to index sites.
Accordingly, the effective, ethical SEO professional needs to become an expert at creating value for searchers and end-users.
SEO is not dead. If anything, it's just changing... for the better.
Good SEOs have the opportunity to work with the Googles of the world to enhance the Internet by advocating the development and efficient dissemination/structuring of quality, useful content.
| 3:01 pm on Jun 6, 2005 (gmt 0)|
Liane, it's not mr Hess (ouch, that's a German Nazi), nor mr. Ess, but Henk van Ess.
|but then again, we aren't supposed to be privy to it. According to Google Guy, the page is protected and marked "Google Proprietary and Confidential" |
Google sure knew what to expect in advance. On May 24 I wrote to the press office (for another story) and ended with a PS:
PS Will contact you in two weeks about Google Evaluation Lab.
Only one week later (now you see how untrustworthy I really am) I wrote:
Here are my questions:
1. How many international agents do you have for Google Rate Hub?
2. Do you pay them $20 an hour?
3. Do you recruit through Monsterboard and universities?
4. Last time (Google Hacking), you told me:
"We index content that we find posted online through an
automated process of crawling the web"
Eh... but why the human evaluation then?
After that question, Google Guy took over. Debbie Frost wrote to me:
I believe Googleguy has already responded to the story you submitted to
If the material was copyrighted, they really would have told me. I did receive some material with a (C) in it, but didn't publish those documents.
| 3:21 pm on Jun 6, 2005 (gmt 0)|
|Liane, it's not mr Hess (ouch, that's a German nazi), nor mr. Ess, but Henk van Ess. |
My sincere apologies ... that really was a boner on my part!
|Google sure knew what to expect in advance. ... |
Regardless of what private communications you may or may not have had with Google employees (which by the way you are not supposed to share with us on WebmasterWorld - please see TOS), I am certain they did not tell you that it was just fine to go ahead and display their form on your blog. As stated before by Google Guy, the form is reportedly marked "Google Proprietary and Confidential".
How can you justify this action by claiming you weren't aware of any restrictions, or by stating that Google knew what to expect? Your arguments don't make any sense. Did you tell them you planned to display a screenshot of their form? I don't see that in your communications!
Someone asks "how high" and you answer ... "yellow".
[edited by: Liane at 3:24 pm (utc) on June 6, 2005]
| This 201 message thread spans 7 pages |