
Can/does Google automatically test for relevancy?

Measured by recording click-throughs...


Umbra

8:29 am on Nov 27, 2003 (gmt 0)

10+ Year Member



Theoretically, Google should be able to automatically determine how relevant its search results are.

Basically, Google records every time a user clicks on the top 10 rankings for a keyword phrase. After the next update, they do it all over again, and then they compare the two sets of statistics. If the search results are less relevant for a keyword phrase after an update, then overall fewer people will click on the new top 10 rankings, and Google can recognize that the algorithm was somehow flawed.

Google would, of course, measure the standard deviation, so they would know if there's a significant difference between the relevancy of different sets of search results.

This would only be an estimate of relevancy, but it's more methodical and objective than the personal opinions of Google staff or webmasters.

I can only imagine that such a large company -- with numerous powerful data processors and leading-edge click-through software (AdSense and AdWords) -- should have no problem with this sort of measuring and number-crunching. Optionally, Google could spread the program across certain times of day, datacenters, and keyword phrases to reduce the processing power required.

If this were done, GoogleGuy could say he has hard statistics to prove that Google's search results are more relevant than ever, instead of the vague diplomacy which doesn't seem to assuage those disgruntled webmasters.

I am certain that Google has already thought along these lines. Unless a) my logic is somehow flawed or b) it's somehow not technically feasible, then c) they've already done it and the data is inconclusive so far, or Google already knows the answer -- one way or another.
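The comparison Umbra describes can be sketched in a few lines. This is a rough illustration only -- the counts are invented, and Google's actual methodology is unknown: aggregate the click-through rate on any top-10 result per phrase before and after an update, then apply a two-proportion z-test to judge whether the change is statistically significant.

```python
from math import sqrt

def two_proportion_z(clicks_a, views_a, clicks_b, views_b):
    """Z-statistic for the difference between two click-through rates."""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    return (p_a - p_b) / se

# Hypothetical counts for one phrase: clicks on any top-10 result
# per 10,000 searches, before vs. after an algorithm update.
z = two_proportion_z(6200, 10000, 5400, 10000)

# |z| > 1.96 would suggest the CTR change is significant at the 5% level;
# with these made-up numbers, z comes out far above that threshold.
print(round(z, 2))
```

Run per keyword phrase, a batch of such tests would give the "hard statistics" the post asks for, at the cost of storing per-phrase impression and click counts across updates.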

Chndru

9:32 am on Nov 27, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>b) somehow it's not technically feasible,
They've been doing occasional click tracking for a long time, especially after updates. Most recent one here: [webmasterworld.com...]

Dave_Hawley

10:06 am on Nov 27, 2003 (gmt 0)



Basically, Google records every time a user clicks on the top 10 rankings for a keyword phrase. After the next update, they do it all over again, and then they compare the two sets of statistics. If the search results are less relevant for a keyword phrase after an update, then overall fewer people will click on the new top 10 rankings, and Google can recognize that the algorithm was somehow flawed.

I'm probably missing something here, but how does the number of times any given link is clicked prove its relevancy? I doubt the surfer even knows whether it's relevant or not until they have clicked and downloaded the page.

Dave

snookie

12:11 pm on Nov 27, 2003 (gmt 0)

10+ Year Member



no no no,

Dave is right. You can't tell if a link is relevant by the number of clicks it gets...

I think it is used to determine the probability of someone clicking on a link. Check out the Stanford paper on PageRank...

snookie

Umbra

2:11 pm on Nov 27, 2003 (gmt 0)

10+ Year Member



I'm probably missing something here, but how does the number of times any given link is clicked prove its relevancy?

Let's say you're looking for a Widget Store and you type in that phrase, and the top 10 rankings appear to be full of news articles, forum posts and other irrelevant results. As a result, you're going to do less clicking. Overall, fewer people will click on any top 10 ranking for the phrase "widget store" (assuming they have the same average expectations).

This does assume that users behave like so: type in a search phrase; browse the results one at a time and read the snippet/description; if the snippet seems relevant, then the link is clicked on. That describes my behaviour on search engines, because I don't have time to open every link in the top 10 or 20.

I admit that this doesn't work for a search result which appears to be relevant until you click on it and find a completely different website than you expected.

OK, although I did use the word "prove" once, I did initially admit that it was an estimate of relevancy.

killroy

2:22 pm on Nov 27, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Erm, wouldn't it be exactly the opposite? If there is nothing relevant in the top 10, the searcher has to click on them all to find out, then move to page 2 and continue... he's produced 10 clicks...

On a relevant SERP, the searcher clicks one and finds what he was looking for...

SN

Umbra

2:33 pm on Nov 27, 2003 (gmt 0)

10+ Year Member



Erm, wouldn't it be exactly the opposite? If there is nothing relevant in the top 10, the searcher has to click on them all to find out

Seriously, is it really true that all of you are blindly clicking on the first 10 listings, without first reading the snippets/description in the search results? Do you really have enough time in the world to click on every single top 10 link for every single search you run? I just assumed that search users had a more critical eye than that.

then move to page 2 and continue... he's produced 10 clicks...

Aaah, but 10 clicks produced from page 2 (ranks 11 to 20) would be recorded differently from clicks on page 1 (the top 10 ranks), so Google could distinguish that.
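Separating those two cases only requires logging the rank of each clicked result. A tiny sketch (the log format and phrase are invented for illustration):

```python
from collections import Counter

# Hypothetical click-log entries: (search phrase, rank of clicked result).
clicks = [
    ("widget store", 1), ("widget store", 3), ("widget store", 14),
    ("widget store", 2), ("widget store", 11), ("widget store", 7),
]

# Bucket clicks by results page (10 results per page), so page-2 clicks
# are counted separately from page-1 clicks.
pages = Counter((phrase, (rank - 1) // 10 + 1) for phrase, rank in clicks)

print(pages[("widget store", 1)])  # page-1 clicks: 4
print(pages[("widget store", 2)])  # page-2 clicks: 2
```

A rising share of page-2-and-beyond clicks for a phrase would then be a signal that page 1 failed the searcher, which is the distinction being argued about here.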

snookie

3:07 pm on Nov 27, 2003 (gmt 0)

10+ Year Member



I find it rare that the descriptions provide a useful snippet of the page contents, especially on Google these days. In fact, I have a great query to show you this. Type in nirvana ringtones on Google. Results: 1) is spam, 2) relevant, 3) relevant, 4) spam, 5) relevant. However, you can't tell these pages are spam (I call them spam because they don't provide the content I'm after - Nirvana ringtones) until you've clicked on the link.

snookie

redzone

6:39 am on Nov 28, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Everyone in here thinks that the "masses" search like we do, and expects results to be at the same level that we feel they should be.

Truth be told, the masses don't give a hoot if all the top 10 results are 100% relevant, spam-free, and on target..

All they care about is that "one" listing matches what they are looking for. If they didn't get what they are looking for, some enjoy playing "detective", and filtering the results sets, while many just blindly click "next page", until they find that one listing...

I've been at this game since '97, and I don't care if we are talking google/Ink/AV, whatever, the average consumer doesn't get all "wrinkled up" about spam, or 100% relevancy. They are just looking for "one" answer to whatever their question is...

Google is only important to us because of the amount of traffic it generates, not how relevant its index is. Inference.com used to deliver links that Ink/Excite/AV/Infoseek couldn't touch. I always used it for heavy research.. Most here have probably never even heard of this engine. It was far more relevant than the "mainstream" players. But I was always looking for better results, unlike the other 99.9% of the world.

europeforvisitors

6:50 am on Nov 28, 2003 (gmt 0)



I've been at this game since '97, and I don't care if we are talking google/Ink/AV, whatever, the average consumer doesn't get all "wrinkled up" about spam, or 100% relevancy. They are just looking for "one" answer to whatever their question is...

Well, if the average consumer can't find the "one" answer to whatever that question is, he or she is going to be annoyed. For some topics, the user may have to dig through multiple SERPs to find anything that isn't a boilerplate affiliate or vendor page. The problem is obviously most acute for keyphrases that pull in results from both "information" and "commercial" sites ("..." for example), but can even be a problem in purely commercial categories (try searching for "Hotel ..." and you may have to dig through three or four SERPs to find the listing for ...hotel).

[edited by: Brett_Tabke at 11:32 am (utc) on Dec. 7, 2003]

redzone

7:23 am on Nov 28, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



europeforvisitors,

That's where you are mistaken.. YOU are annoyed because this listing doesn't appear on the first page...

The average consumer is thrilled if the listing is there, ranked #1, but isn't going to be annoyed if it isn't... We're conditioned that this is an "imperfect" world.

5 months ago, the Google "groupies" were touting that Google was so far superior to any other search mechanism on the web that the others should be put out to "pasture", and the freed-up bandwidth and computing power donated to science, so that we could get on with finding the cure for cancer....

Well, newsflash: now a majority here think Google has gone "left field" on them... And they can find plenty of examples that don't include their website to prove their point.... Problem is, I can find plenty of examples that Google is still very relevant.. People just want to whine that Google is not relevant "NOW", because their website isn't in the top 10 today, and it surely deserves its place there.....

SE's change algo's, always have, always will.. What worked yesterday, doesn't work today. What works today, may not work tomorrow.. Stop whining about it, saying Google is "broken", and figure out how to get back in the index, and rank well...

I know we've ventured far off the subject of Google having automatic methodology for relevancy testing, but I tire of the whiners.... Seen that for the last six years also...

Just because you had a top 10 yesterday doesn't mean that you're going to have one (or deserve one) today, or in the future... For every example someone can give of how broken Google is, I can find one that conflicts with that opinion... That's the way the SE world goes...

LateNight

7:46 am on Nov 28, 2003 (gmt 0)

10+ Year Member



For some it is not a question of unrealistic expectations about being number 1/page 1 - it's a question of clean sites being completely decimated on terms they used to rank on. I do not mind being knocked down a few notches - changes and fluxes are expected. However, in my area I see numerous clean and relevant competitors being completely obliterated and replaced with internal government order forms. I am hoping this is a glitch - if not, I am hoping for Yahoo and MSN to take a monolith down a few notches.

europeforvisitors

7:51 am on Nov 28, 2003 (gmt 0)



redzone:

I've got plenty of top 10s. In fact, I've got plenty of #1 listings. So I hope your comments about "whiners" aren't directed at me, because I feel that Google has treated me very well on the whole.

But I'm a consumer, too, and I get annoyed when I'm searching for product information and can't find it or am forced to dig down through half a dozen SERPs because of boilerplate pages that don't give me what I'm looking for.

Sure, Google has excellent search results in many categories. But in others, it's still losing the battle against what might be called "duplicate content clutter." Whether consumers are annoyed by having to wade through such clutter obviously depends on whether they encounter it.

redzone

8:00 am on Nov 28, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



europeforvisitors:

Nah, wasn't targeting you.. There are hundreds in the Florida update thread that my comments were directed at...

I agree Google may have turned the knob a little hard on this update. No SE has the perfect algo; Google has proven that... :)

The majority of us say that we're unbiased, and that we're thinking in the best interests of the "Search World" when we state our opinions, but inside we're thinking about traffic to our own sites.... AV has a fairly good set of results now, but who cares? AV's traffic is minimal in the scope of things.

But if the current update sticks at G, then time is better spent figuring out how to make the most of a "new" situation.. It's a moving target, and that's what most people don't understand or like.. They tune their website for the largest traffic source, and they think that everything should freeze-frame and their top 10s should stick like cement.... :)

snookie

9:43 am on Nov 28, 2003 (gmt 0)

10+ Year Member



"It's a moving target" - redzone

Totally correct! Now what was this thread about? Oh yes, measuring relevancy... The only way to find the "most" relevant page on a SERP is surely recording the last page a user clicks on... assuming the user doesn't give up!
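That "last click wins" heuristic is easy to sketch. The session data and field names below are made up for illustration: group clicks by search session and treat the final click as the result that (presumably) satisfied the user.

```python
# Hypothetical session logs: each entry is (session_id, timestamp, clicked_url).
log = [
    ("s1", 10, "example.com/spam"),
    ("s1", 25, "example.com/good"),    # last click in session s1
    ("s2", 12, "example.org/answer"),  # only click in session s2
]

# Sort by session, then time; later clicks overwrite earlier ones,
# so each session ends up mapped to its final clicked URL.
last_click = {}
for session, ts, url in sorted(log, key=lambda e: (e[0], e[1])):
    last_click[session] = url

print(last_click)
# {'s1': 'example.com/good', 's2': 'example.org/answer'}
```

As the post notes, the heuristic breaks down when the user gives up entirely -- an abandoned session's last click marks a page that failed, not one that satisfied.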

As for Google going downhill, well, I think it has for a long time. It is far too easy to spam. My own website is a testament to this ;)

The Florida update hasn't moved me anywhere. In fact, if anything, I suspect my traffic has increased. I don't believe the update has removed a significant amount of spam. However, that is my opinion.

Disappointed with the Google results I've been getting, I have tried using alltheweb exclusively. However, the toolbar installed in my browser gives me such easy access to Google results.

There is now a mindset that if you want to search for something, you go to Google. It's a brand name in much the same way that Ford and McDonalds are (which goes without saying, wouldn't you think?).

People are happy with the Google results, which is a sad thing, because IMHO Google, these days, is like Ford, and alltheweb like Audi.