tedster - 3:39 am on Jul 10, 2012 (gmt 0)
@gouri, no, that's not what I meant. I was talking about Google's continual statistics-based tweaking of the search results, not their measurement of our websites. They are testing and testing again, overlapping various tests for different types of user satisfaction, and so on.
They use statistical sampling to make sense of all those results, to see which types of results are most pleasing to their users. They are not offering some standard or objective "measure" of where a website should rank.
The result for those website owners who are trying to "track their progress" up the search results can be very frustrating, especially if they are not yet on page one for their chosen query phrase. In many cases there is absolutely no way you can say "I rank #16" or any particular number - because you rank at many different spots for different users.
Webmaster Tools gives you that statistical picture for various keywords - and you'll notice that you rank over a sometimes very broad range of positions. If you are on the first page for a relatively competitive query, you may find things more stable and then it's somewhat accurate to say "we rank at #5 today" or something like that. But it still can be different between morning and evening at your local time!
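To make that concrete, here's a toy sketch (my own illustration, not Google's actual method or the real Webmaster Tools calculation) of why a single "rank" number is misleading when position varies per user. The data is hypothetical: one query whose position swings from #8 to #24 depending on the test bucket a user lands in, aggregated the way an impression-weighted "average position" report might do it.

```python
from collections import Counter

def average_position(impressions_by_position):
    """Impression-weighted average position across all sampled rankings."""
    total = sum(impressions_by_position.values())
    weighted = sum(pos * n for pos, n in impressions_by_position.items())
    return weighted / total

# Hypothetical impression counts per position for one keyword.
samples = Counter({8: 120, 11: 300, 16: 450, 24: 130})

avg = average_position(samples)
print(f"positions seen: {sorted(samples)}")
print(f"impression-weighted average position: {avg:.1f}")
# Here the average lands around 14.6 even though the single most
# common position is #16 - so "I rank #16" and "I rank #14.6" are
# both true-ish and both incomplete.
```

The point is just that the reported number is a summary of a whole distribution, not a fixed slot on the page.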
Recently I've heard a lot of complaints about the whole domain jamming thing and Google dropping their conventional host crowding scheme. In many cases, people are complaining about the number of results from one domain on pages 2 and 3. Down there in the depths, where impressions are low and CTR even lower.
When I look for the same results, I would say that a good bit of the time I don't see the same domain repetition. Clearly there's more testing going on, and then some more.
I'm also sure that Google has tested query terms and their related SERPs for various times of the day - and any variations we see are the result of their ongoing statistical work.