tedster - 9:38 pm on Dec 31, 2010 (gmt 0)
There's another totally excellent thread that bears directly on this topic - Google & Traffic Shaping [webmasterworld.com]. In that thread, Shaddows shares some well-evidenced analysis, not just about the total amount of traffic but about its quality.
Shaddows: Using a multivariate dataset, across a range of different keyphrases, user intents and user types, Google exposed our site in marginal but significant ways (putting us up one place, dropping Universal search, above or below shopping results, etc). They did this with (at least) four separate sets.
So far, this is the only analysis of traffic shaping that includes a look at ranking changes. If Google is involved with this type of thing, then in my opinion, ranking changes MUST be part of the picture. If rankings stay the same but traffic shows major disturbances, I just can't see laying the cause at Google's feet.
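To make that cross-check concrete, here's a minimal Python sketch of what I'm describing - separating traffic swings that coincide with a rank change from swings that don't. The daily figures and the 30% "major disturbance" threshold are invented for illustration, not anything Google publishes:

```python
# Given daily rank and click counts for one keyphrase, flag big traffic
# swings and note whether the ranking moved at the same time. If it
# didn't, the cause probably isn't a simple ranking change.

daily = [
    # (day, rank, clicks) - hypothetical figures
    (1, 3, 210), (2, 3, 198), (3, 3, 205),
    (4, 2, 320),               # swing WITH a rank change
    (5, 2, 310), (6, 2, 150),  # swing with NO rank change
]

for (d0, r0, c0), (d1, r1, c1) in zip(daily, daily[1:]):
    swing = abs(c1 - c0) / c0
    if swing > 0.30:  # hypothetical "major disturbance" threshold
        cause = "rank moved" if r1 != r0 else "rank unchanged - look elsewhere"
        print(f"day {d1}: clicks {c0} -> {c1} ({swing:.0%}), {cause}")
```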
If traffic shaping is real, Google's goal would most likely be to serve their users better, not to run a program that targets webmasters. Of course, webmasters would still feel the effects, whether positive or negative.
This was my initial reply to Shaddows in that thread:
We talked about three very broad buckets of intention - "informational, navigational and transactional" - although I'm sure Google has a much more refined set of user intention buckets than this. Another user intention could be "locational". There's little doubt that some queries have an implied geographic component.
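Just to make the bucketing idea concrete, here's a toy query-intent classifier. The cue words and the four bucket names are my own guesses at how such a taxonomy *might* look - Google's real classifier is unknown and surely far more refined:

```python
# A toy rule-based query-intent bucketer. Cue lists are hypothetical.
INTENT_CUES = {
    "transactional": {"buy", "cheap", "price", "coupon", "order"},
    "informational": {"how", "why", "what", "guide", "tutorial"},
    "locational":    {"near", "nearby", "directions"},
}

def classify(query: str) -> str:
    words = set(query.lower().split())
    for intent, cues in INTENT_CUES.items():
        if words & cues:
            return intent
    return "navigational"  # default: probably looking for a specific site

for q in ("buy running shoes", "how do rankings work",
          "pizza near me", "webmasterworld"):
    print(f"{q!r:32} -> {classify(q)}")
```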
Here's the missing piece in that analysis. In order to tailor specific SERPs to specific user intentions, Google must also classify each website, and possibly each URL, within a matching taxonomy. Only then would they know which type of page to return for which type of user intention.
It seems to me that Google has cranked up some kind of statistical testing - one that tries out a given page against different types of query intentions, and then takes note of the results. After a while, they could discover which intention taxonomy works best and then make a more stable assignment of website type - and some pages might have more than one type.
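Here's a sketch of the kind of test I'm conjecturing: expose one URL to queries from each intent bucket, record a quality signal (I'm using click-through rate as a stand-in), and assign the URL to whichever bucket or buckets clear a bar. All the numbers, the minimum sample floor, and the CTR bar are invented for illustration:

```python
from collections import defaultdict

impressions = defaultdict(int)
clicks = defaultdict(int)

# (intent bucket, impressions, clicks) observed during a test phase
observed = [
    ("informational", 1000, 40),
    ("transactional", 1000, 110),
    ("locational",     200,   9),   # small sample: weak signal
]
for intent, imp, clk in observed:
    impressions[intent] += imp
    clicks[intent] += clk

MIN_SAMPLE = 500   # hypothetical floor before trusting the CTR
CTR_BAR = 0.05     # hypothetical "works well for this intent" bar

assigned = []
for intent in impressions:
    if impressions[intent] < MIN_SAMPLE:
        continue  # not enough data yet - keep testing
    if clicks[intent] / impressions[intent] >= CTR_BAR:
        assigned.append(intent)  # a page can land in more than one bucket

print("stable assignment:", assigned or "none yet - keep rotating")
```

Note that nothing in this sketch forces a single answer - a page that clears the bar in two buckets gets both, which matches the idea that some pages might have more than one type.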
Yes, this thread is not the place for complaining.
Serious analysis only, please!
If this is the case (and yes, it is definitely in the realm of conjecture, not something we've proven) then I would expect most sites that get this treatment to stabilize relatively quickly, as Shaddows reported. If a site stays stuck in such a pattern, then my guess would be that either Google can't get statistically significant information for some reason (weak signals of some kind) - or my analysis is just plain wrong - that's always a possibility ;)
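One way to picture "weak signals": a two-proportion z-test on CTRs from two intent buckets. With small samples, the very same CTR gap is not statistically significant, so (in this conjecture) the site would stay stuck in the testing pattern. The figures are made up:

```python
from math import sqrt

def z_stat(c1, n1, c2, n2):
    """z statistic for the difference between two proportions."""
    p1, p2 = c1 / n1, c2 / n2
    p = (c1 + c2) / (n1 + n2)                    # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))   # standard error
    return (p1 - p2) / se

# The same 8% vs 5% CTR gap at two different sample sizes:
for n in (100, 5000):
    z = z_stat(int(0.08 * n), n, int(0.05 * n), n)
    verdict = "significant" if abs(z) > 1.96 else "too weak to call"
    print(f"n={n:5d}: z={z:4.2f} -> {verdict}")
```

At n=100 the gap is too weak to call; at n=5000 the identical gap is clearly significant. A site with thin query volume might simply never generate enough data for Google to settle on an assignment.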