
Google SEO News and Discussion Forum

eval.google.com - Google's Secret Evaluation Lab..
"Rater Hub Google" Rumours?
Imaster
msg:773420
7:09 am on Jun 2, 2005 (gmt 0)

On Apr 19, 2003, some members spotted referrals from
eval.google.com/happier/quest/rateall.py
followed by a question number and a couple of different email addresses. You can read the earlier threads here:
[webmasterworld.com...]
[webmasterworld.com...]

According to some [slashdot.org] sites [searchbistro.com],

It's one of the best kept secrets of Google. It's a mystery on WebmasterWorld. In Europe (France), too, they don't know what to make of that odd URL [eval.google.com....] Click it and you get... nothing. The site reveals itself only if you have the proper login and you use a network known to Google. Traces of eval.google are found on the web, but the full content of the mystery site has never been published before. Here it is: the real story about eval.google. They use... humans!

The site claims it is some kind of secret Google evaluation lab!
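
For anyone wondering how such referrals get spotted: they show up as referrer strings in ordinary server logs. A minimal sketch in Python, assuming a combined-format Apache access log; the path and pattern are illustrative, not from the thread:

import re

LOG_PATH = "access.log"  # hypothetical path to a combined-format log

# In the combined log format, the referrer is a quoted field.
referrer_re = re.compile(r'"(https?://eval\.google\.com[^"]*)"')

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = referrer_re.search(line)
        if match:
            print(match.group(1))  # e.g. .../happier/quest/rateall.py?...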

 

JulianDay
msg:773510
8:48 pm on Jun 6, 2005 (gmt 0)

Let's not get too silly about copyright! It starts to look like smoke and mirrors.

(My observations are copyright free and for good reason: they're just an opinion with no intellectual content.)

GoogleGuy
msg:773511
8:51 pm on Jun 6, 2005 (gmt 0)

Whaaa? How did Henk know that I'm really Larry!? Doh, I said it out loud! Doh! Just kidding. :)

I sincerely appreciate that you've stopped posting the documents and taken out the employee's name. Let me tackle the last question you asked:
Please explain. I saw in Eval several duo-lists based on the same search terms. Most duo-lists show a different order of answers than the other list. The raters were asked to choose which answers were the best. If this is not filtering, what is it then? I have many other examples.

Think of it like a taste test. If a drink maker had an idea for how to tweak their formula, they might have one version with more vitamin C, or another version with more sugar. It's natural to ask testers for their feedback. But you wouldn't say that the taste testers were directly changing the formula that was sold in stores. It's even less directly tied to search results: if there's a slight preference for one type of scoring, but that ranking takes 100x the computing power, it may well be a better choice to use that 100x computing power for a different task that improves quality more.

I think it's absolutely a great idea to collect feedback/quality ratings about different types of algorithms. But hopefully the analogy of a taste test shows that we may collect feedback without it actively altering our search results.
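
GoogleGuy's taste test maps onto a simple side-by-side evaluation: raters see two rankings ("duo-lists") for the same query, pick the better one, and the tallies form a report card rather than a live ranking change. A minimal sketch in Python, with all names and numbers invented for illustration:

from collections import Counter

def preference_report(votes):
    """votes: one 'A' or 'B' per rater for a single query's duo-list."""
    tally = Counter(votes)
    total = len(votes)
    return {side: count / total for side, count in sorted(tally.items())}

# Hypothetical raters comparing ranking A (current) with ranking B (a tweak).
votes = ["A", "B", "B", "A", "B", "B", "B", "A", "B", "B"]
print(preference_report(votes))  # {'A': 0.3, 'B': 0.7}

Even a 70% preference for B would not mean B ships: as the post says, if B needs 100x the computing power, that budget may buy more quality spent elsewhere.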

JulianDay
msg:773512
8:58 pm on Jun 6, 2005 (gmt 0)

GoogleGuy,

Whatever row you are having with someone (which I neither understand nor care about):

This thread does suggest that Google has human input.

It may not be direct human input, but it exists, and it has always been denied.

Has Google been untruthful about this in the past?

Dayo_UK
msg:773513
9:00 pm on Jun 6, 2005 (gmt 0)

That's right, Larry. ;)

When I was a little kid we used to have to test drinks at our school sometimes :)

We had to say which one tasted nicer (thing is, we always said that they all tasted nice - that way you got to finish the drink).

Wish we got paid $10-$20 though.

JulianDay

Did you really think that Google had reached the stage of no human input? We have not reached the I, Robot stage yet, I think.

ratherbeboating
msg:773514
9:00 pm on Jun 6, 2005 (gmt 0)

Google Guy,

It certainly sounds wrong for someone to take Google's "secret" documents and put them where they are visible to others.

As it happens, I have found stuff in the Google cache that people realized too late was secret and removed from the web. I am sure it has happened at least ten times: someone has information that they decided to hide, and they are either unaware of the Google cache or don't know what to do about it, and I then get the info that I need.

It seems that there is at least a bit of "wringing of hands" here that is undeserved until Google gets rid of the cache. (Of course, I would hate to see it go.)

No disrespect intended, it just seems like very similar things, with similar arguments for and against. (Most websites do say copyright on them.)
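
The cache lookups described above used Google's classic cache: query operator. A minimal sketch, assuming that operator and URL form; note that Google throttles scripted queries, so this is illustrative only:

import urllib.request
from urllib.parse import quote

def fetch_cached(url):
    """Fetch Google's cached copy of a page via the cache: operator."""
    cache_url = "http://www.google.com/search?q=" + quote("cache:" + url)
    request = urllib.request.Request(
        cache_url, headers={"User-Agent": "Mozilla/5.0"}
    )
    with urllib.request.urlopen(request) as response:
        return response.read()

# html = fetch_cached("example.com/page-removed-too-late.html")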

voelspriet
msg:773515
9:08 pm on Jun 6, 2005 (gmt 0)

I love close reading.

...without it actively altering our search results

Why don't you say:

...without it altering our search results

akmac
msg:773516
9:14 pm on Jun 6, 2005 (gmt 0)

"Why don't you say:

...without it altering our search results"

Because he wanted to be accurate. Obviously, the only purpose of such experiments is to improve search results. I think the "active" versus "passive" line has been adequately presented by GG. Why flog it?

JulianDay
msg:773517
9:16 pm on Jun 6, 2005 (gmt 0)

Google has repeatedly stated that the algos alone determine serp position.

I appreciate that individual humans may not have altered particular serps.

But there is now no doubt that there has been human input into serps.

I am not saying this is a bad thing. It may well be a good thing.

But Google has, for as long as I can remember, always denied any human intervention in the serps.

I hope that sums it up.

[edited by: JulianDay at 9:20 pm (utc) on June 6, 2005]

voelspriet
msg:773518
9:19 pm on Jun 6, 2005 (gmt 0)

I love close reading [google.nl]
Do you mean by flog:
a. beat severely with a whip or rod;
b. English and New Zealand slang for sell;
c. treat roughly or without respect?

Seriously, the whole point is that Google uses human input. Only last week I got a quote from Google that they can't help it that secret documents are spidered, since machines are doing the job.

[edited by: voelspriet at 9:24 pm (utc) on June 6, 2005]

TypicalSurfer
msg:773519
9:19 pm on Jun 6, 2005 (gmt 0)

But hopefully the analogy of a taste test shows that we may collect feedback without it actively altering our search results.

Analogies only work for the person who uses them.

If it were just a taste test, you would only need two buttons, "yucky" and "yummy", to rate serps. So what's the deal with rating individual pages?

[edited by: TypicalSurfer at 9:31 pm (utc) on June 6, 2005]
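
The difference TypicalSurfer is pointing at is one of granularity: one verdict for the whole results page versus a record per result. A sketch of the two shapes of feedback, with field names and labels invented for illustration:

from dataclasses import dataclass, field

@dataclass
class SerpRating:
    """The two-button model: one verdict for the whole results page."""
    query: str
    verdict: str  # "yummy" or "yucky"

@dataclass
class ResultRating:
    """Per-page rating, closer to what the leaked guide apparently asks for."""
    query: str
    url: str
    relevance: str  # e.g. "useful" or "off-topic"
    spam_flags: list = field(default_factory=list)  # e.g. ["hidden text"]

Per-result records are what you would want if the goal were to diagnose why a results page is bad, not merely that it is bad.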

akmac
msg:773520
9:26 pm on Jun 6, 2005 (gmt 0)

I can see your point. However, if the results from testing are used to modify the algorithm to return better results...? Does anyone see a problem with this?

If, as you imply, there is manual manipulation of the results (versus algorithmic manipulation), then I agree with you wholeheartedly. However, nothing in the provided documentation (or dialogue) conclusively indicates that that is the case.

PS: (A)

walkman
msg:773521
9:32 pm on Jun 6, 2005 (gmt 0)

>> It *is* Trade Secret information.<<

No it isn't. The source code of Google's algo is; a memo on what is or isn't spam is not. If Coca-Cola has a memo on how to spot chipped bottles (using common-sense info available elsewhere), that is not a trade secret, no matter how much they say it is. Their formula is, though, unless... blah blah blah.

>> If there is no legitimate news reason, and the journalist had reason to believe that it was still trade secret, they can be held liable for damages. <<

Here's how I'd justify it: "Google claims that math solves all problems and that robots do it better; this memo shows otherwise."
Of course it's not entirely true, but it's true enough to get First Amendment protection in the USA. To say this is not newsworthy is dishonest. Now, if he had stolen it, or hacked Google to get it, that would be totally different.

voelspriet
msg:773522
9:32 pm on Jun 6, 2005 (gmt 0)

However, if the results from testing are used to modify the algorithm to return better results

Sounds good, doesn't it? But what are the criteria for evaluation, and which URLs are chosen? Why is Kelkoo on Google's whitelist and some of its competitors not? Why is children.pr-e-g-n-a-n-c-y-p-a-m-p-e-r-s.co.uk, according to Google, a 'sneaky redirect' while www.film.com is not? These kinds of questions interest me...

[edited by: Brett_Tabke at 10:16 pm (utc) on June 6, 2005]
[edit reason] obfuscated link [/edit]

PatrickDeese
msg:773523
9:33 pm on Jun 6, 2005 (gmt 0)

> But there is now no doubt that there has been human input into serps.

Was there ever any doubt?

- Google has used DMOZ directory data for over 5 years - using the opinions of thousands of editors to back up their algorithmic ranking.

- Whenever Google has made an algo change, they've used JS to track visitors' clicks - aka human feedback (see the sketch after this list).

- The patent filing that was released specifically mentions "TrustRank" and how it was to be achieved (through human feedback).

- Obviously, the search engineers check sets of search results to ensure that the end results of changes are appropriate.
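
As a sketch of what that click feedback can show, here is a minimal aggregation of logged clicks by result position before and after a change; the log format is an invented assumption:

from collections import defaultdict

def click_share_by_position(clicks):
    """clicks: iterable of (query, clicked_position) pairs from a JS logger."""
    counts = defaultdict(int)
    for _query, position in clicks:
        counts[position] += 1
    total = sum(counts.values())
    return {pos: n / total for pos, n in sorted(counts.items())}

before = [("widgets", 1), ("widgets", 3), ("gadgets", 2), ("gadgets", 4)]
after = [("widgets", 1), ("widgets", 1), ("gadgets", 2), ("gadgets", 1)]
print(click_share_by_position(before))  # clicks spread down the page
print(click_share_by_position(after))   # clicks concentrating near the top

If clicks concentrate higher on the page after a change, that is (noisy) evidence the change helped.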

--

What is very apparent is that these "hub raters" do not tank individual sites - they rate serps, and Google uses the human feedback to tweak its algorithmic results.

This has been going on for a long time, and claiming that the hub raters caused a relatively new site to "disappear" from the serps is nothing more than an odd combination of paranoia and hubris.

If your site can't pass human inspection and is ranking, it is going to get removed from the Google serps eventually - either through an update, or human intervention instigated by a spam report, or a keyword spot check.

Lesson: make your sites unique and beneficial for the end user - nothing shocking about it.

[edited by: PatrickDeese at 9:35 pm (utc) on June 6, 2005]

claus
msg:773524
9:33 pm on Jun 6, 2005 (gmt 0)

I noticed the job postings a short while back, but as far as I recall those were limited to the US. Perhaps those were not for the "eval" team, but for something else?

Never mind, I for one welcome... uhm... For a long time I've been thinking that a higher degree of human involvement in web page rating was indeed necessary, not just "nice". Humans simply perceive things in a different way than bots do (if one can indeed speak of bot perception).

>> Why is Example1 on Google's whitelist

I haven't seen said list, nor evidence2 that it exists, but if it does, that would definitely not be the way to go. A "spammer" is a "spammer", regardless of site name or URL. There is more than enough confusion about what "spam" is and isn't without such a list confusing matters even more.

Webmasters routinely become confused, misinformed, and scared, even paralyzed, for fear of doing the wrong things. This benefits neither the www, nor the webmasters, nor the search engines. Openness, clarity, and firm, clear-cut principles that are valid for everyone (and are either enforced across the board or not at all) are the only way.

If it isn't "spam" when "Company X" does it, but "spam" when "Company Y" does it - then what is it, exactly? And with such an attitude, how are we (webmasters and advisors) going to take anything on this matter seriously, ever?

---
1) voelspriet, you might want to hold back on direct references to external sites due to board TOS
2) I haven't read the full thread, or the /. story, or the blog, sorry about that. Too little time...

[edited by: claus at 9:51 pm (utc) on June 6, 2005]

bird
msg:773525
9:35 pm on Jun 6, 2005 (gmt 0)

Surprise!

Google's algorithms are created by humans!
And they get changed based on human feedback (= evaluation)!
The world is coming to an end!

voelspriet
msg:773526
9:37 pm on Jun 6, 2005 (gmt 0)

Google's algorithms are created by humans!

Oh well, I'm off. Loved the discussion! Will come back here more often.

akmac
msg:773527
9:40 pm on Jun 6, 2005 (gmt 0)

"Surprise!
Google's algorithms are created by humans!
And they get changed based on human feedback (= evaluation)!"

Awww..... Bird, you were supposed to save that for page 50. ;-)

nemo2
msg:773528
9:45 pm on Jun 6, 2005 (gmt 0)

What I read at Henk van Ess's Search Bistro will, I reckon, bring winds of war to the marketing industry (i.e. the nuking of affiliate sites that carry the codes and URLs of several well-known affiliate marketing companies). That would also do economic damage to the 90% of WebmasterWorld members who run affiliate programs.

MikeNoLastName
msg:773529
10:14 pm on Jun 6, 2005 (gmt 0)

Most important of all, what is Google planning to do about re-evaluating sites that were penalized when they really shouldn't have been, and what will it do about the mistakes? If this doc is proven to be real, they can no longer fall back on the typical canned responses about how individual sites are not penalized, yada, yada... I would venture that this topic covers a lot of the people who were dumped in the SERPs during Bourbon without any apparent reason. I don't expect mere algorithm tweaking to remove an obvious "offending site" penalty.

In our case we were apparently OK for 10 years: we have never done any affiliate programs, our links are strictly flat monthly advertising rates, we have tons of unique content, etc. Apparently it was only our recent addition of AdSense ads - added at Google's own request, after they contacted us and reviewed our site - that put us into "offensive content" territory per pages 5/6 of the doc. We haven't changed anything else on the site since then. Even AdSense support and our own rep told us directly that if they ever noticed a TOS violation or the like, they would contact us and "work with us" to fix things rather than cancelling our account. Now they simply dump us from the SERPs without notice and threaten even our prior income from our other advertising clients, who no longer want to advertise with a dumped site. Well, thanks a LOT, G!

JulianDay
msg:773530
10:26 pm on Jun 6, 2005 (gmt 0)

Some misunderstandings here and there <snip>.

SE algos look for good search results (the input and output to this algo is complex. <snip>

The algo is created by humans <snip>.

[edited by: lawman at 10:46 pm (utc) on June 6, 2005]

Dayo_UK
msg:773531
10:31 pm on Jun 6, 2005 (gmt 0)

<snip>Sorry Brett - aint helping am I by getting wound up - I will just ignore :) - looks like other members have - no-one else seems to take any notice of him</snip>

[edited by: Dayo_UK at 10:42 pm (utc) on June 6, 2005]

Kangol
msg:773532
10:32 pm on Jun 6, 2005 (gmt 0)

It is a good thing that Google is using human raters; hopefully they will be educated enough to tell the good from the bad.
I've looked a little at the spam guide, and there are some basic things in it that we know to stay away from; however, I do not see any guidelines on how to spot a blog/forum/wiki spammer.
In my opinion, that is what we should be fighting against.

reseller
msg:773533
10:34 pm on Jun 6, 2005 (gmt 0)

nemo2

>What I read at Henk van Ess's Search Bistro will, I reckon, bring winds of war to the marketing industry (i.e. the nuking of affiliate sites that carry the codes and URLs of several well-known affiliate marketing companies). That would also do economic damage to the 90% of WebmasterWorld members who run affiliate programs.<

I don't think that Google is waging a general war against affiliate program marketing, and there are no indications that it is doing so.

Google is just trying to clean the serps of affiliate link farms, banner farms, and affiliate pages with no value added.

Affiliate program marketing pages with good content are still ranking well on the serps.

So please go ahead and join serious affiliate programs and just remember to create valuable PRE-SELL pages. Good luck.

UK_Web_Guy
msg:773534
11:07 pm on Jun 6, 2005 (gmt 0)

If this eval system isn't intended to highlight specific pages/sites as spam and ultimately lead to their removal or demotion in rank, then why does the Spam Guide go into so much detail about how to spot hidden text, affiliate links, etc.?

These are things that are years old - so how can a rater saying "this site has x amount of hidden text" or "this site has x number of affiliate links" help to improve the algorithm?

It can't. If the algorithm could handle these things, or if G wanted to base ranking factors on their use, it would already do so. The only purpose of these reviews, then, is to highlight individual cases - which to me means manual reviewing of the SERPs to remove the "spam".
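
To see why even a years-old trick resists full automation, consider a naive hidden-text check; it catches only the crudest white-on-white case, which is plausibly where human raters come in. Purely illustrative, not Google's method:

import re

# Flag inline styles that set white text on a white background.
HIDDEN_RE = re.compile(
    r'style="[^"]*color:\s*#?fff(?:fff)?\b[^"]*'
    r'background(?:-color)?:\s*#?fff(?:fff)?\b',
    re.IGNORECASE,
)

def has_naive_hidden_text(html):
    return bool(HIDDEN_RE.search(html))

print(has_naive_hidden_text(
    '<div style="color:#ffffff; background:#ffffff">cheap widgets</div>'))  # True
print(has_naive_hidden_text('<p style="color:#000">normal text</p>'))  # False

A CSS class, an external stylesheet, or an off-screen position defeats it instantly: the rule is easy to state and hard to enforce algorithmically, which is exactly where a human eye is cheap.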

I can appreciate the irritation that GG has about this info being made public and I think the fact that he is, or appears, so irritated also speaks for itself.

jk3210
msg:773535
11:24 pm on Jun 6, 2005 (gmt 0)

<<so how can a Rater saying this site has x amount of hidden text or this site has x number of affiliate links help to improve the algorithm?>>

As I read the guide, a rater can improve the algo by giving input that goes something like "even though this page is 99% aff links, it also has 1% OTHER content that I as a human find helpful, and therefore this page should not be flushed."

I mean, how could any group of SE geeks EVER succeed in writing an algo that took into consideration ALL potentially "useful" content? How would an algo know what was "useful" unless a human told it?
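
One way rater input like that can feed back into the algo without anyone hand-editing serps is as labeled data. A toy sketch with invented numbers: pick the affiliate-link-ratio cutoff that best agrees with human keep/flush judgments.

# (affiliate_link_ratio, human_says_keep) pairs from hypothetical raters.
labels = [(0.99, True), (0.95, False), (0.90, False), (0.50, True),
          (0.98, False), (0.30, True), (0.97, True), (0.85, False)]

def agreement(cutoff):
    """Share of rater judgments a 'keep if ratio <= cutoff' rule matches."""
    return sum((ratio <= cutoff) == keep for ratio, keep in labels) / len(labels)

best = max((c / 100 for c in range(101)), key=agreement)
print(best, agreement(best))  # 0.97 0.875

Note that the page that is 99% affiliate links yet worth keeping is exactly the case no single cutoff can capture - which is the point: the human judgment encodes something the simple feature can't.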

As a perfect example: I've been looking at those *otelguide.net pages, given as a no-spam example, since 1998, because they are my competitor, and I NEVER saw that "video" link the guide pointed to as an example of useful content.

BillyS
msg:773536
11:32 pm on Jun 6, 2005 (gmt 0)

I find it funny that some folks around here compare voelspriet's reporting to that of "Deep Throat." He's posting NDA-covered materials that Google probably considers a trade secret. I'm not sure any business would appreciate it if its employees started disclosing what it might deem competitive information.

So what is the motive - mainstream reporting of news? Nah, we all know it's not really news. Personally, I figured they used humans to evaluate results. That's quality control 101. So why would voelspriet post this information?

My guess - traffic. He realized that he could make a good buck by posting the information. But what about the risk of Google lawyers? Ignorance is a pretty shaky defense.

Voelspriet is probably betting that Google wants to keep this quiet. After all, the mainstream public has no idea how a search engine really works. Most people would come away shaking their heads, having reached the wrong conclusion: that Google is really just a bunch of people sitting in a room looking at results.

walkman
msg:773537
11:36 pm on Jun 6, 2005 (gmt 0)

>> I find it funny that some folks around here compare voelspriet's reporting to that of "deep throat."

I find it funnier that people don't read, or understand the entire post before picking a word or two and slamming it.

MikeNoLastName
msg:773538
11:44 pm on Jun 6, 2005 (gmt 0)

What UK_Web_Guy says matches my opinion. As any AI language programmer knows, if they already had a firm enough grip on what the algorithm had to do to weed out these sites to be able to write this general document, they would have done so, and they would use humans only to catch SPECIFIC urls missed, or accidentally hit, by it - to refine it. Perhaps this human-created whitelist/blacklist is the extra .5 that GG was talking about. If so, I would have expected them to put it in much sooner.
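
If such a list exists, its mechanics would be trivial next to the algo itself: human-reviewed overrides applied after scoring. A sketch of the idea, entirely hypothetical, since the thread offers no evidence of how (or whether) Google applies one:

WHITELIST = {"goodsite.example"}  # human-reviewed: exempt from penalties
BLACKLIST = {"spamsite.example"}  # human-reviewed: removed outright

def apply_overrides(scored):
    """scored: {hostname: algorithmic_penalty} for one query's results."""
    adjusted = {}
    for host, penalty in scored.items():
        if host in BLACKLIST:
            continue  # the SPECIFIC urls the algo missed
        if host in WHITELIST:
            penalty = 0.0  # the SPECIFIC urls the algo accidentally hit
        adjusted[host] = penalty
    return adjusted

print(apply_overrides({"goodsite.example": 0.8,
                       "spamsite.example": 0.1,
                       "other.example": 0.0}))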

BillyS
msg:773539
1:02 am on Jun 7, 2005 (gmt 0)

Too bad we weren't supposed to see it. That's not how it works, at least in the USA. We weren't supposed to know of the Watergate break-in, the Enron tape recordings, the Pentagon Papers, the Nixon tapes, the MCI e-mails, or thousands of other (relevant or not) internal memos from other companies.

It's even funnier how someone tries to compare illegal activities of companies with trade secrets. Maybe I just misunderstand your point with this post.

joeduck
msg:773540
1:11 am on Jun 7, 2005 (gmt 0)

GG's concerns are understandable given that this creates problems for Google and that the "sinister" nature of this information has been exaggerated.

What has NOT been exaggerated is that Google has almost certainly misled the community about how important human decision making is to the process of determining spam.

I think that human intervention makes a lot of sense, but I've been frustrated yet again after taking Google support notes at face value - only to find that they are misleading me about what's up with the ranking process. (I'm not saying GG has misled anybody - I'm talking about the emails sent by Google support that clearly imply that humans are NOT evaluating sites.)
