
Google's New Fair Ranking Algo?

AthlonInside

6:37 am on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



In the competitive areas, rankings seem to be different every day and across different datacenters. Some sites disappear and reappear at different times and on different days. Do you think Google is giving fair exposure to the most relevant sites, so that the top sites won't monopolize the rankings all the time?

NovaW

7:07 am on May 25, 2003 (gmt 0)

10+ Year Member



I doubt they would ever use that methodology. Their interest is in providing the best results for searchers.

The changes you are seeing are because the indexes are spread over 10,000 PCs in 9 datacenters - so you will potentially get a different server each time, & not all the indexes are identical on every machine.
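NovaW's point can be illustrated with a toy model (the site names and scores here are invented for illustration - this is not a claim about Google's actual setup): if the index replicas are not all identical, the ranking you see depends on which replica your query happens to hit.

```python
# Toy model: three out-of-sync index replicas for the same query.
# Site names and relevance scores are hypothetical.
replicas = [
    {"site-a": 0.90, "site-b": 0.85, "site-c": 0.80},  # updated yesterday
    {"site-a": 0.84, "site-b": 0.88, "site-c": 0.80},  # updated this morning
    {"site-a": 0.90, "site-b": 0.85, "site-c": 0.92},  # mid-update
]

def search(replica_id):
    """Rank sites using whichever replica the load balancer handed us."""
    replica = replicas[replica_id % len(replicas)]
    return sorted(replica, key=replica.get, reverse=True)

# The same query, served by different machines, yields different orderings.
for i in range(3):
    print(f"replica {i}: {search(i)}")
```

Nothing about the sites changed between queries - only which copy of the index answered.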

AthlonInside

7:28 am on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I am not talking about different data due to different datacenters; I am talking about different data every day on the same datacenter, especially in a competitive area.

NovaW

7:56 am on May 25, 2003 (gmt 0)

10+ Year Member



I think it's probably just due to hitting different servers on the datacenter & things being in flux.

An algo that rotated sites to be fair to webmasters would involve some judgement on the sites in question and would be incongruent with providing the best algo-assessed relevancy to searchers.

As for being fair to webmasters, there is the page on the Google site that explains how to act (& of course GG posts here) - beyond that, any attempt to serve fairness to webmasters would be a downfall for Google.

Visit Thailand

8:01 am on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Just an idea, but could there be a difference in when the pages and sites were last updated? I am sure Google is trying very hard to ensure all results are the most relevant and hence up to date.

Could it be that some sites have updated and others have not, and that this accounts for the difference?

Mozart

10:10 am on May 25, 2003 (gmt 0)

10+ Year Member



AthlonInside and NovaW are scratching the surface of a question we have not spoken about for a long while now, I think:
Their interest is in providing the best results for searchers.

What actually are relevant, good results?

Google more or less determines whether a page is on topic or theme by measuring how often and in which places the keyword is used. The theme of not only this one page but also of others on the site may be relevant. Pages on other sites linking to this page cast a vote for its relevance. The link text they use is an indication of the topic/theme of the page linked to. Those vote-casting pages themselves may or may not fit into that same theme.

The problem with this and any other approach to finding good and relevant results is that SEO exists. If we define SEO as manipulation to artificially enhance one's pages' rankings in the search results, then we have to admit that all the above factors - once known - can be manipulated. The more SEOs manipulate their pages, the less relevant the results may become.

IMO this is the major reason why Google and other SE are forced to re-invent their algo every now and then, simply to make this manipulation more difficult. Keep the same recipe for too long and your results become less relevant.

Back to the above question: Could it be that Google has created a new Fair Ranking Algo?

What if Google noticed exactly this - that for the keyword "widgets", one in three results in the top 20 is irrelevant? So instead of tweaking the otherwise great-working algo to cater for this anomaly, they added a factor of randomness that allows site 20 to be #3 sometimes, site 10 to be #1 sometimes, and site 1 to drop to #20 sometimes?

Maybe the research showed Google that in such a highly competitive keyword it doesn't really matter which site is #1 because it may be there only due to manipulation (called SEO) anyway?

Disadvantage for Google: a surfer may find that Google's research is wrong - she always wants to find site 4 at #1 and is very irritated by this seemingly constant flux. Google would need to monitor the number of spam reports to find out if this is true.

Advantage for Google: SEOs trying to crack the algo have a near-impossible task at hand, as this constant flux was created by purposely introducing a fraction of randomness into the recipe. Google can continue to give out their Do's and Don'ts and webmasters will try to adhere.

Any thoughts on that?
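For what it's worth, the "fraction of randomness" Mozart describes could look something like this toy sketch (purely hypothetical - nothing here is known to be Google's method). Seeding the noise by the day would keep results stable within a day but rotate them between days, while a clearly weaker site still couldn't jump the queue:

```python
import random

def rank_with_daily_noise(scores, day, noise=0.05):
    """Re-rank after adding a small random factor seeded by the day.

    Sites within `noise` of each other can trade places from day to day;
    sites separated by more than 2*noise never swap.
    """
    rng = random.Random(day)  # same ordering all day, fresh draw tomorrow
    jittered = {site: s + rng.uniform(-noise, noise) for site, s in scores.items()}
    return sorted(jittered, key=jittered.get, reverse=True)

# Three near-tied contenders and one clearly weaker site (invented numbers).
scores = {"site1": 0.81, "site2": 0.80, "site3": 0.79, "alsoran": 0.50}
for day in range(3):
    print(f"day {day}: {rank_with_daily_noise(scores, day)}")
```

Note that "alsoran" stays last no matter what the dice say - the randomness only reshuffles sites the algo already considers near-equal.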

percentages

10:32 am on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Simple question! If Google had the best algo and most relevant results before this update, and most seem to think this update has made a major change, then does it still have the most relevant results today?

For all the Google lovers out there, you are now faced with a tricky conundrum.....either it was wrong before, or it is wrong now....you can't have it both ways, as almost everyone seems to agree the results are not consistent!

mat

10:39 am on May 25, 2003 (gmt 0)

10+ Year Member



Again, it's still a moving target. It seems obvious that some big switches were flipped last night, and, for me, the results are far better as a result.

As GG himself has said many times, stage one was getting the new (ground-up rebuild) indexes to all data-centres and then, over the next few weeks, filters will be applied.

A quick scan seems to suggest that the filter they applied last night seems to have hit the cookie-cutter recip-link sites.

Google: reports of my death have been greatly exaggerated.

Yidaki

10:40 am on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>almost everyone seems to agree the results are not consistent!

I'd say *not yet* consistent.

>either it was wrong before, or it is wrong now....

So there'd be a third option:

everything will be fine *after* Google has finished all its tweaks

HitProf

10:44 am on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It's a very interesting thought Mozart.

I don't think what you described has actually happened yet, but it sounds like a good experiment for Google. Didn't I read somewhere that users complain about seeing the same results over and over again?

If the serps fluctuate a little (not too extremely) each day, that might actually benefit Google's users. Regular users want to find new sites, not the same pages/sites over and over again. They can bookmark the top sites if those interest them. They use a search engine to find *new* sites on their favourite subjects.

So from a users point of view I think it's a good idea. As an SEO I'm not so sure :)

Sidenote: I don't think spam reports are the best way to measure user satisfaction.

wanderingmind

11:01 am on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I agree. For competitive keywords, the top 10 results are not going to make any difference to someone looking for information, and a randomness factor would scuttle the SEOs while being hardly noticed by users.
But as to whether Google has actually gone ahead and done it, we do not have enough data yet. Don't think we will get that out of GoogleGuy either ;)

philipp

11:22 am on May 25, 2003 (gmt 0)

10+ Year Member



Mozart, I wouldn't define "SEO as the manipulation to artificially enhance one's pages' rankings in the search results". At least in the long run, this is the exact opposite of SEO. SEO could almost be defined as ensuring a site does _not_ artificially manipulate to enhance rankings. Anything that doesn't serve the end-user is not good SEO, and might be penalized sooner or later.

percentages

11:23 am on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>So there'd be a third option:
>everything will be fine *after* Google has finished all its tweaks

What does that option mean? That Google is temporarily broken, so people should use a different SE until Google declares it is fixed? Tell me that and I'll ask exactly when it will be fixed, so that I can come back and use it again.

Take too long fixing it and I might not come back, of course ;)

Yidaki

11:35 am on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>Google is temporarily broken, so people should use a different SE until Google declares it is fixed?

Those people who complain about the current index quality (mostly webmasters) would in fact be better off using a different SE until Google is finished - it'll save them a lot of nerves.

Jane/Joe Surfer will change her/his preferences anyway if she/he is not happy with the results. ;)

Google has been a great free traffic source for some years now. Personally, I absolutely don't see why I should give them the boot if they have a problem or don't satisfy all webmasters for some period. Don't you think we should give the Google techies time to finish their changes?

jtbell

11:52 am on May 25, 2003 (gmt 0)

10+ Year Member



An algo that rotated sites to be fair to webmasters would involve some judgement on the sites in question and would be incongruent with providing the best algo assessed relevancy to searchers.

Not necessarily.

In "real life," when I'm not fooling around with my Web site, I teach science. One of the major topics we cover in lab is how to analyze random experimental uncertainty. We can never measure any quantity exactly. A good scientific measurement always includes estimating the uncertainty ("plus or minus" value) that goes along with it.

Google's algorithm presumably comes up with some kind of final score that determines the position in the SERPs. There has to be some kind of uncertainty associated with that score, and differences smaller than that value are simply not meaningful.

Just from the point of view of providing a ranking list, it would be reasonable for Google to add/subtract a randomizing factor on each search, based on the uncertainty. That way, if a bunch of sites at the top were close enough together to be effectively equal, each would randomly get turns at the very top of the SERPs.

[edited to clarify the wording a bit]

[edited by: jtbell at 11:55 am (utc) on May 25, 2003]
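jtbell's measurement-uncertainty idea can be expressed as a quick simulation (the scores are hypothetical; this illustrates the statistics, not Google's actual algorithm). Sites whose scores differ by less than the uncertainty take turns at #1, while a site well outside that band never gets a turn:

```python
import random
from collections import Counter

def top_result(scores, sigma, rng):
    """Add/subtract up to the uncertainty `sigma` and return the #1 site."""
    noisy = {site: s + rng.uniform(-sigma, sigma) for site, s in scores.items()}
    return max(noisy, key=noisy.get)

rng = random.Random(42)
# a, b and c sit within one uncertainty band of each other; d is well below.
scores = {"a": 1.00, "b": 0.99, "c": 0.98, "d": 0.70}
wins = Counter(top_result(scores, sigma=0.05, rng=rng) for _ in range(10_000))
print(wins)
```

Because 0.70 + 0.05 can never exceed 0.98 - 0.05, site "d" mathematically cannot reach #1 - the randomness only rotates positions among sites that are already statistically indistinguishable.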

percentages

11:53 am on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>Don't you think we should give the Google techies time to finish their changes?

Personally, no! And my traffic from Google is up since this fiasco!

If my favorite physical store stocks a load of junk, I have no hesitation in shopping elsewhere; they don't get a break because they used to be a good store....if they don't cut it today they are out....it's a dog-eat-dog world!

Google should have made and tested their changes before launching this index on the World. Google claims it did in fact do that, so we have to assume that Google thinks these results are good.

Indeed Jane/Joe Surfer will go elsewhere if they disagree....so whatever webmasters feel about this update kind of gets balanced out. If Google screws up it will lose users and webmasters will ultimately not care, because those same users will be available via another source!

mat

12:03 pm on May 25, 2003 (gmt 0)

10+ Year Member



'Load of junk' ... 'fiasco' ... there are so many of these emotive threads that everyone gets weary of them, but some things can't just be left out there, unchallenged.

There are a few loud voices - there always are, understandably - but there just isn't the groundswell that you presume backs up your statements. I see a very public change taking place, but, when the dust settles, is it really that drastic? I don't see it.

Google, at the moment, is pubescent, and you're demanding that it spring, fully formed, overnight. If you don't like it, then they're doomed. Your favourite-store analogy isn't valid - they may go out of business, but it will take weeks, maybe months. At least give Google the same sort of timescale, then praise or condemn them.

All I have seen is a few days of turbulence, not a funeral parade.

worker

12:21 pm on May 25, 2003 (gmt 0)

10+ Year Member Top Contributors Of The Month



I am not seeing the line up in datacenters that has been mentioned, and I don't see that a big change was made last night.

When I check the 9 datacenters, there are significant differences in the position of my site across almost all of them for multiple keyword phrases.

Also, the differences vary by datacenter and by keyword phrase, so I'm not seeing the 'lining up' yet.

The differences are also very noticeable when checking the www, www2 and www3 datacenters.

Anyone else still seeing differences?

percentages

12:23 pm on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



mat, if the day ever comes that you need an attorney to defend you, I hope you still feel that people have the right to get it wrong for a reasonable amount of time before they need to get it right.

After all if your attorney gets it wrong on the first attempt you may not be around to see their next attempt!

Google is just the same. If they "practice" in a live environment and get it wrong they are severely hurting some people (not me, but I feel for those that are hurting).

Saying to these folks, give Google time, it may all be okay in a "while" is actually of little comfort. Right now they are trying to decide whether to take out another mortgage on their homes or jump off a bridge....telling them to chill out and give Google some time doesn't really help them.

Yes, you can argue that no one should be reliant on Google, but lots of people are. Do they deserve to suffer while a few techies test their latest theory?

Google treats me well today, and has for the last 3 years. But the day may come when it doesn't, and for totally obscure reasons. Personally I am in favor of reducing its influence if it has a tendency to test half-baked theories in a live environment (it may cost me today, but will not cost me long term).

mat

12:32 pm on May 25, 2003 (gmt 0)

10+ Year Member



percentages - funny you should use a legal metaphor, as I was close to writing that you seem to be judge and jury all in one. Not wanting that to sound personal, but you saying that G have got it wrong, and G actually being wrong are not the same thing. It's all subjective, and, again, I don't see the massed ranks in the 'wrong' camp.

It's been said by many before, and it'll doubtless be rammed down my throat when I have cause to bemoan G, but if a business/man can be close to jumping off a bridge as a result of a slight/short-term change in the way G sees things, then they're in too tenuous a position.

Yidaki

12:32 pm on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>they are severely hurting some people

>Right now they are trying to decide whether to take out another mortgage on their homes or jump off a bridge...

>Do they deserve to suffer while a few techies test their latest theory?

percentages, we have had this discussion many, many times here at WebmasterWorld. Why do you think people jump off bridges every day? Why do you think the mortgage business makes so much money? Because people make mistakes and often don't look at themselves to find out what THEY can change to avoid future mistakes!

percentages

1:00 pm on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>Not wanting that to sound personal, but you saying that G have got it wrong, and G actually being wrong are not the same thing. It's all subjective, and, again, I don't see the massed ranks in the 'wrong' camp.

I never said Google had it wrong! I asked: if they had it right before and now it is different, then how can they still have it right? Either it was wrong before or it is wrong now.....both can't be correct, can they?

The argument that it is wrong now, but that we should give them time to make it right, is flawed. It is like saying Microsoft insists you install a version of Windows that doesn't work, but they might make it work sometime in the future. Announce that to the world and watch what happens to MS shares!

Time is not an option for any company that has a major influence on businesses. They can live or die by getting it right. Sure they can take the odd hit, but only for a short period of time.

Yidaki

1:05 pm on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>Microsoft insists you install a version of Windows that doesn't work, but they might make it work sometime in the future.

erm, sorry for going OT, but isn't this what MS does all day long?

>They can live or die by getting it right. Sure they can take the
>odd hit, but only for a short period of time.

hmm ... 150,000,000 searches a day. Even if the quality were sooo poor that they lost 100,000 searches every day, they would still have 1,500 days to fix all the problems before they'd really be out of business. Not a very short period of time ...

[edited by: Yidaki at 1:10 pm (utc) on May 25, 2003]

kevinpate

1:10 pm on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



lmao... using M$ as an example in that context was (intended or otherwise) a welcome touch of humor this morn.

Kirby

2:52 pm on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The argument that it is wrong now, but we should give them time to make it right is flawed.

IMO, it isn't an issue of wrong before or wrong now. It is incomplete now.

philipp

4:04 pm on May 25, 2003 (gmt 0)

10+ Year Member



jtbell: Just from the point of view of providing a ranking list, it would be reasonable for Google to add/subtract a randomizing factor on each search, based on the uncertainty. That way, if a bunch of sites at the top were close enough together to be effectively equal, each would randomly get turns at the very top of the SERPs.

Yeah, that might make sense. I wrote about something similar (Google and the Evolution of the Democratic Web [blog.outer-court.com]) because some people who link on a certain topic always pick one of the first few entries, thereby making it score even better.

AthlonInside

9:04 am on May 27, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hehe, it is really like: today is my day, so my listing appears; tomorrow is competitor A's day, so I vanish and he appears. The next day it is competitor B's and competitor C's day, so they appear while competitor A and I vanish. Then the following day it is my day and competitor A's day again, so we appear and competitors B and C vanish ... ...

stever

9:59 am on May 27, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>Yes, you can argue that no one should be reliant on Google, but lots of people are. Do they deserve to suffer while a few techies test their latest theory?

If they rely for their business model on a variable that they are unable to control, then they don't deserve to suffer, but the likelihood of them suffering is quite a bit higher.

UK_Web_Guy

10:49 am on May 27, 2003 (gmt 0)

10+ Year Member



AthlonInside

Based on your post, I am assuming that the changes you made to your site (as described in the semi-penalty thread), which made your index page reappear, have in fact been switching in & out?

AthlonInside

12:28 pm on May 27, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



UK_Web_Guy,

Yes, the index page disappeared again after they got the newest cache into all the datacenters (around midnight, when I happened to be checking). I've been feeding each freshbot visit a different version of my page to test everything - one version waiting to roll out and another waiting to be crawled. I will see if freshbot can really do anything. At the same time I am also observing my competitors. Will post more observations from the Google experiments later. :) Good luck guys.
