londrum - 9:53 am on Sep 29, 2010 (gmt 0) [edited by: londrum at 10:00 am (utc) on Sep 29, 2010]
if you think long term, how is google going to know which sites are best when they only give users 10 sites to click on?
if the users have nowhere else to go, then the results are going to get stale pretty quickly. all the traffic is going to end up in the same place. surely they need click data on the sites further down the SERPs so they can work out which ones deserve to rank at the top.
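to make that concrete: about the only way to gather click data on pages outside the visible top 10 is to occasionally rotate one of them in and see how it performs. here's a minimal python sketch of that idea (epsilon-greedy style exploration) — purely hypothetical, not anything google has documented; the function names and the 5% exploration rate are made up for illustration:

```python
import random

EPSILON = 0.05  # fraction of impressions spent "exploring" (assumed value)

def pick_top_10(ranked_results):
    """Return the 10 results to display for one impression.

    Most of the time this is just the normal top 10, but a small
    fraction of the time a deeper result is swapped into a visible
    slot so it gets a chance to earn (or fail to earn) clicks.
    """
    top = ranked_results[:10]
    rest = ranked_results[10:]
    if rest and random.random() < EPSILON:
        slot = random.randrange(10)
        top[slot] = random.choice(rest)
    return top

ranked = [f"site-{i}" for i in range(1, 51)]
print(pick_top_10(ranked))
```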
i think google's new ajax-style search is already skewing their data, even without this.
previously, if your site appeared in the top 10 and never got a click, we could expect it to go down. fair enough. but now it can appear in the top 10 for a matter of seconds and never even be looked at.
if someone searches for "france train accidents" (a morbid search, sorry) then google will display one set of results when the user types "france", another set when he gets to "france train" (neither of which is relevant to his ultimate query), and a final set when he finishes typing the whole phrase.
what happens to the data on those first two sets? none of the sites will be clicked because they're not relevant, but does that count as a black mark against them?
how is google to know that the final search term was completely different from the first two? as far as their new ajax-style SERPs are concerned, every site shown was relevant, and only the handful at the end interested the user.
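one plausible fix (my speculation, not anything google has confirmed) is to only count an impression once a result has sat on screen for some minimum time, so those transient prefix renders don't dilute a site's click-through rate. a minimal python sketch of the counting problem, with made-up field names and a made-up 3-second threshold:

```python
MIN_DWELL_SECONDS = 3.0  # assumed threshold, for illustration only

# (query, url, seconds_on_screen, clicked) -- hypothetical impression log
log = [
    ("france",                 "example-site.com", 0.4, False),
    ("france train",           "example-site.com", 0.6, False),
    ("france train accidents", "example-site.com", 8.0, True),
]

def ctr(events, min_dwell=0.0):
    """Click-through rate, optionally ignoring short-lived impressions."""
    impressions = [e for e in events if e[2] >= min_dwell]
    clicks = [e for e in impressions if e[3]]
    return len(clicks) / len(impressions) if impressions else 0.0

print(f"naive CTR:          {ctr(log):.0%}")                     # 33%, skewed low
print(f"dwell-filtered CTR: {ctr(log, MIN_DWELL_SECONDS):.0%}")  # 100%
```

counted naively, the site looks like it was ignored two times out of three; filtered by dwell time, the two prefix renders drop out and the picture matches what the user actually did.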
i don't really believe they're going to make this a permanent feature, though. it sounds a bit crazy.