One downside, though: suppose I searched for a particular keyword and found some sites I thought were informative. The next time I try to get back to those sites using the same keyword, chances are I won't find them, because of the rotating algo.
If G implemented some search personalization, it could work: you create a G account and add sites that interest you to online bookmarks stored on G's servers. Then every time I search for the same keyword, the results page would include a section like "From your bookmarks", showing any relevant sites I had bookmarked for that keyword.
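The "From your bookmarks" idea above boils down to filtering a user's saved sites by the query terms. Here is a minimal sketch of that filtering step; the `Bookmark` class and `from_your_bookmarks` function are invented names for illustration, not anything Google actually exposes:

```python
# Hypothetical sketch of a "From your bookmarks" section.
# All names here are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Bookmark:
    url: str
    title: str
    keywords: set = field(default_factory=set)  # terms the user tagged it with

def from_your_bookmarks(query, bookmarks):
    """Return the user's bookmarks whose tags match any query term."""
    terms = set(query.lower().split())
    return [b for b in bookmarks
            if terms & {k.lower() for k in b.keywords}]
```

With per-user storage behind it, this section could sit above the rotated organic results, giving the user a stable anchor even while everything below it changes.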
I have also noticed this.
I have been analysing this since Google started updating on Feb 07. It looks like it is still in progress: on some datacenters the rankings have been updated along with backlinks, but other datacenters still have the same old database. It is almost a week since Google started the update process. When will it be finalized? Sometimes I see updated results, and sometimes the same old results with small modifications.
What do you think, when will Google update its main database with the new algo?
I don't think we are seeing deliberate rotation; I think we are seeing different datacenters with slightly different sets of results. It's a mountain of data Google needs to keep in sync, and at times, especially after a major change, the data can take quite a while to settle.
A SERP rotation would not be good for the average user. A user might search today for "blue widget", come back tomorrow and search again, and the new SERP would not help him get to the sites he wants. It's confusing.
Mack, I don't think we are seeing deliberate rotation either. But the datacenter results are not slightly different, they are vastly different; that's my feeling.
Deliberate rotation does not work, at least at first look. Say, for example, I search for 'microsoft'. It would be crazy if the Microsoft site did not rise to the no. 1 position in every search. Same for 'google'.
I think it could work. Of course, the creative minds at Google would need to do a lot of research and development to come up with a scheme that really works, for the user and for advertisers.
For instance, if a search came up about Microsoft (or any other domain name), that site would always land at the top because the keyword "microsoft" is found in the domain.
I agree with Imaster that somehow "bookmarks" would need to be the thing that makes it work. But then we would need to log in, or avoid clearing our cookie cache, just to do a search. Google is about simplicity, so it would take a lot of brainstorming and brain power to accomplish this. It would definitely be interesting to see, and Google would re-invent the wheel, again. Maybe they should try it as a Beta, like an open project, like DMOZ. Call it MiniGoogle.
"What if Google rotated their search results. Meaning: Every time you do a search you never see the same search page twice, the entire search is randomly but ingeniously rotated, excluding the sponsored ads."
Nope. Horrible idea. One of the reasons McDonald's and other chains are so successful is that people like consistency and predictability. You get the same horrible Big Mac anytime and anywhere you order it. This is why this update is so bad for Google. Many established websites have been sandboxed, and people can't find them even when searching domain.com.
Rotation of search results in the manner you describe is an appealing concept (and one that many have pondered) that upon analysis doesn't hold up. Reason #6: It would give a brand new, and undesirable meaning to the Google button "I'm Feeling Lucky".
*very clever :)
Great point BigUns,
I never thought about the "I'm Feeling Lucky" button. I don't use it much because there's never been a real need to. But with search page result rotation it might become a useful button. Maybe the "I'm Feeling Lucky" button could list the Top 10 (or 20) sites that are optimized.
I think the "I'm feeling lucky" button should take you to the Yahoo SERPS.
'I think the "I'm feeling lucky" button should take you to the Yahoo SERPS.'
Now That's funny!
And if GG reads it, it may help motivate the Plex to finalize Allegra.
And if Microsoft MSN Search reads it, they would be well-advised to implement it!
After some thought on SERP rotation, I realized that optimization would still be a major factor, because how else would the engines determine relevant results?
This would just redefine the game and re-invent the wheel. No one site could monopolize a keyword any longer; it would be a fair playing ground for everyone.
Say there are John, Jack, Jill, and Joan. They make up the entire web. They each have a widget web site to share (advertise). When John does a search for "widget" he gets Jack's widget page; the next time he searches he gets Jill's widget page; the next time, Joan's widget page. Now if John and Jack simultaneously search "widget", John gets John's widget page and Jack gets Jill's widget page; they simultaneously search again, and John gets Jack's widget page and Jack gets Joan's widget page... and so on.
As opposed to the current search engine: if John searches "widget" he gets Jill's widget page. If John and Jack simultaneously search "widget", they both get Jill's widget page... they search again and they both still get Jill's widget page.
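One simple way to read the John/Jack/Jill/Joan example above is a per-user round-robin: each user keeps a counter per query, and every repeat search starts one slot further down the same ranked list. This is a toy sketch under that assumption; `RotatingSerp` and its internals are invented names, not any real engine's design:

```python
# Toy per-user SERP rotation: each (user, query) pair has a counter,
# and each repeat search rotates the ranked list by one more slot.
# All names are invented for illustration.
from collections import defaultdict

class RotatingSerp:
    def __init__(self, index):
        self.index = index                # query -> ranked list of results
        self.seen = defaultdict(int)      # (user, query) -> search count

    def search(self, user, query):
        results = self.index.get(query, [])
        if not results:
            return []
        offset = self.seen[(user, query)] % len(results)
        self.seen[(user, query)] += 1
        # Rotate: the same user's next search starts one slot later.
        return results[offset:] + results[:offset]
```

Because the counters are keyed per user, two people searching simultaneously can land on different pages, exactly as in the example.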
The "I'm Feeling Lucky" button could be renamed, or kept the same, and used as previously mentioned (BigUns) to retrieve the regular, common, static results.
Or they can leave the current buttons how they are and add one more button titled "I'm Feeling Stupid" for people who want their search results rotated like me. :-0
I seriously think SERP rotation would be fair, incredible, and brilliant. It would be dubbed "Search Sharing"... and the searchshare domain is already registered, I already checked, "you animals" :-)
I was the one that suggested a "randomization factor" might be a good thing, based on the idea that no search engine is going to be able to make a really effective, objective judgment about what is "best" for a term.
I wasn't thinking of total randomization, which would be terrible, but a "factor" or algorithm that would work along with the existing factors.
It makes no sense to have a site come in at the 20,000th spot one time and then first the next. But it shouldn't be that difficult to have some sort of random process within "levels".
And, it needn't be the case that EACH search would result in different results for the same terms for the same person.
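One way to read the "random process within levels" idea: bucket results into coarse score tiers, keep the tiers in order, and shuffle only inside each tier, so a 20,000th-place page can never jump to #1. The function below is a sketch of that reading; the name, the tier width, and the scoring scale are all assumptions made up for illustration:

```python
# Sketch of tiered shuffling: sort by score, group into coarse score
# buckets ("levels"), and randomize order only within each bucket.
# All names and the tier_size parameter are invented for illustration.
import random
from typing import Optional

def tiered_shuffle(scored, tier_size=10.0, rng: Optional[random.Random] = None):
    """scored: list of (url, score) pairs. Returns urls, randomized per tier."""
    rng = rng or random.Random()
    scored = sorted(scored, key=lambda pair: -pair[1])  # best score first
    out, tier, current = [], [], None
    for url, score in scored:
        bucket = int(score // tier_size)
        if bucket != current and tier:
            rng.shuffle(tier)     # randomize within the finished tier
            out.extend(tier)
            tier = []
        current = bucket
        tier.append(url)
    rng.shuffle(tier)             # flush the last tier
    out.extend(tier)
    return out
```

This matches the posts above: each individual search can reorder peers within a level, but the levels themselves, and anything far down the rankings, stay put.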
I thought Google's objective was to provide the best possible results?
If there are 2 different SERPs, which one is the best?
I can't see Google attempting that.
I too believe we are seeing a settling of data, with new sites being put into the mix slowly but surely.
Just my thoughts.
When Google first started, their mission was the fastest results; then it was the best results. Now why can't it be "fair results"? Make everyone happy. They have the market share to do it.
The "I'm Feeling Lucky" button can be renamed to "Best Results", "Top Results", "Recommended Results", or "Static Results"...that might work~!
"Everyone gets a fair chance."
Hmmm... seems like a pretty level playing field to me right now. Although I didn't like my site sitting on the virtual sidelines for a year, I can understand the necessity of making the age of inbound links (among other things) affect the serps (seems like a great idea to keep out the spam). I think Google should simply change their guidelines for webmasters to include a statement to the effect that it may be X amount of time before your site appears in the index (or something to that effect). Of course, I doubt that would make it past the PR police.
As for established sites disappearing...it's not necessarily unheard of (remember the "disappearing index page" glitch a couple of years ago?). I'm sure most will be back eventually. Googlebot has been knocking on our door almost constantly all day now, so I'm sure other established sites are getting actively crawled as well.
I think people tend to forget to put themselves in google's shoes. They have to do whatever they can to stem the tide of spam in their serps.
What do you think would happen if they simply left the algo unchanged for even 6 months? Not only would most legitimate website owners (at least in commercially viable areas) start losing ground to spammers, but the steady stream of traffic would also be diluted as people grew more and more dissatisfied with Google's serps. If a few sites have to be placed in a virtual "holding area" until they are vetted by the system, then perhaps that's the price we need to pay in order to enjoy all the free traffic. Frankly, I'd rather have 7-10 months a year of free traffic than none at all because spammers have overrun my sector.
This latest update got rid of a TON of doorway/cloaker/scraper/datafeed trash sites in my sector, which frankly is like a breath of fresh air.
"Make everyone happy. They have the market share to do it."
Sorry to say, but these guys are mathematicians, not philosophers. Each page is ranked, and you're either on top or you aren't.
Alta Vista actually tried this a few years ago.
It had made itself the webmasters' favorite target because of its market share and instant indexing: see your site at number one in a few hours!
So it decided to rotate its SERPs to jumble things up a bit. Wound up just confusing its users.
Then Google came along...