Whoa - 1:37 pm on Feb 26, 2011 (gmt 0)
It seems to me that if I were at Google and wanted to improve the algorithm, I'd be intently focused on page and site quality measures that cannot be easily gamed. In the past, SEO folks could figure out how to get a page to rank well. But the reactions of the people who visited the page cannot be easily controlled or gamed. The subjective review of each searcher determines whether a page is relevant to their search or not.
For example, after visiting a page via a Google search, does the user immediately return to Google and search again on the same phrase or a close variant? If so, that page and site were not great results for the searcher, and perhaps they should be lowered in the rankings over time for that particular search phrase. For other phrases, the same page could still be ranked high -- if and only if it shows signs of quality for that phrase.
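To make that concrete, here is a toy sketch (in Python) of the "came back and searched again" signal I'm describing. The log format, the 60-second window, and the similarity test are all my own guesses for illustration, not anything Google has confirmed:

```python
# A minimal sketch of the "re-searched the same phrase" signal described above.
# The log layout, field names, and 60-second window are assumptions for the example.
from difflib import SequenceMatcher

def similar_query(a, b, threshold=0.8):
    """Treat two phrases as 'the same or a close variant' if they mostly overlap."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def count_requeries(events, window_seconds=60):
    """events: list of (timestamp, action, query, url) tuples, sorted by time.
    Counts, per result URL, how often a user clicked it and then came straight
    back and searched the same (or a very similar) phrase within the window."""
    requeries = {}
    for i, (ts, action, query, url) in enumerate(events):
        if action != "click":
            continue
        for later_ts, later_action, later_query, _ in events[i + 1:]:
            if later_ts - ts > window_seconds:
                break
            if later_action == "search" and similar_query(query, later_query):
                requeries[url] = requeries.get(url, 0) + 1
                break
    return requeries

# Example: the user clicks a result, then searches a close variant 20 seconds later.
log = [
    (0, "search", "blue widget reviews", None),
    (5, "click", "blue widget reviews", "example.com/widgets"),
    (25, "search", "best blue widget review", None),
]
print(count_requeries(log))  # {'example.com/widgets': 1}
```

The point is that a signal like this is hard to game at scale, because it comes from what real searchers do after they leave the results page.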
There are myriad ways to measure this and myriad complexities to each of the measurements. For example, if you view the time somebody spends on a site after a search as a measure of quality, you can argue it either way. A lot of time spent on the site could mean it's a great site for what they're looking for, or it could mean the site doesn't deliver relevant answers in a concise and immediate fashion.
Same thing with bounce rates: a high bounce rate doesn't necessarily reflect badly on the page. It could mean the page gives the answers people are looking for and they are done -- which is a good thing. Or it could mean that Google is sending people to a page it shouldn't be for a given search.
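One way to see why neither signal works on its own is to pair it with the re-search behavior above. This little sketch is purely my own illustration -- the thresholds and labels are invented -- but it shows how the same short visit can read as good or bad depending on what the searcher does next:

```python
# Rough sketch of the ambiguity: dwell time and bounces only make sense in context.
# Thresholds and the pairing with a follow-up search are assumptions for illustration.

def interpret_visit(dwell_seconds, bounced, searched_same_phrase_again):
    """Classify a single post-search visit.

    dwell_seconds: time on the landing page
    bounced: True if the user left without viewing a second page
    searched_same_phrase_again: True if they went straight back to the
        engine and repeated (or barely reworded) the query
    """
    if searched_same_phrase_again:
        # Short visit plus an immediate re-search: the page probably didn't answer.
        return "poor result for this phrase"
    if bounced and dwell_seconds < 15:
        # A bounce alone is ambiguous; without a re-search we can't call it bad.
        return "ambiguous: quick answer or quick rejection"
    return "likely satisfied"

print(interpret_visit(8, bounced=True, searched_same_phrase_again=True))
print(interpret_visit(8, bounced=True, searched_same_phrase_again=False))
print(interpret_visit(240, bounced=False, searched_same_phrase_again=False))
```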
So, I would tend to agree with Tedster that this is a new data-driven learning algorithm that is designed to get better results to the top of the rankings over time.
If you think about it, you really have to shake up your existing rankings to deploy something like this. For example, you need to put what used to be a #41 ranking for a search up in spot #3 and see if it performs better than the incumbent #3. For a learning system to work, it needs to test hypotheses that haven't been tested or tracked before...and that means you may have to show some bad results in the top rankings to test whether they are truly bad. It's essentially a massive multivariate testing machine.
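Here's a toy version of that testing machine, just to show the shape of the idea. The 5% exploration rate, the satisfaction scores, and the assumption that the hidden gem sits at #41 are all made up for the sketch:

```python
# Toy "shake up the rankings and measure" loop: for a small fraction of searches,
# promote the #41 page into slot #3 and track how often each shown page satisfies.
# Exploration rate and satisfaction probabilities are invented for illustration.
import random

def serve_results(ranked_pages, explore_rate=0.05, test_slot=2, challenger_rank=40):
    """Occasionally promote the page at challenger_rank (0-based, i.e. #41) into test_slot."""
    results = list(ranked_pages)
    if random.random() < explore_rate and len(results) > challenger_rank:
        challenger = results.pop(challenger_rank)
        results.insert(test_slot, challenger)
    return results

def update_score(scores, page, satisfied, weight=0.1):
    """Exponentially weighted running estimate of how often a page satisfies searchers."""
    prior = scores.get(page, 0.5)
    scores[page] = (1 - weight) * prior + weight * (1.0 if satisfied else 0.0)

# Simulate many searches: if the page buried at #41 ("page40" here, 0-based) really is
# better, its satisfaction score overtakes the incumbent's and it earns the slot.
scores = {}
pages = [f"page{i}" for i in range(50)]
for _ in range(10000):
    shown = serve_results(pages)
    clicked = shown[2]                        # pretend the user clicks slot #3
    truly_good = clicked == "page40"          # assume the hidden gem started at #41
    update_score(scores, clicked, satisfied=truly_good or random.random() < 0.4)
print(sorted(scores.items(), key=lambda kv: -kv[1])[:3])
```

The cost of running an experiment like this is exactly what we're seeing: some formerly well-ranked pages temporarily lose their spots while the challengers get measured.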
So, in the new era, SEO puts you in the hunt to rank well -- it's a way to communicate to the Google algos, after all -- but ultimately the best content will float to the top, regardless of the factors that used to matter more and could be optimized without any true quality behind them, such as links, title tags, etc. If you truly have the best page for a given search and have been knocked down, you will likely float back to the top results over time...probably in a month or so.
While I'm frustrated that traffic has dropped on my site, it's a motivator to create better content for users and reflect on how my pages can be more valuable to site visitors. I mean that's always been a motivator, but you can get complacent and fall into ruts, so I guess we all need to make lemonade out of this lemon, and keep on trucking.