My non-technical take is that between the new interface pushing organic results down even further and the introduction of the left-hand margin in search results, a portion of the traffic we'd all otherwise get is going to ads, maps, and tools to a larger degree.
I see what you're saying to a point, but there are odd things that keep happening with the traffic on one of my sites, and I haven't had stats on it long enough to understand them...
In the new layout thread I posted about its traffic tanking at about the same time as the rollout of the new layout, but today is the best Saturday I've had with it. It's been a long-tail site from the start, and it had consistently moved up in traffic throughout the change until last week, when things got all 'goofy', technically speaking of course.
My visits are generally trending up, with last week being down, and visit times / page view averages are all over the place. It's so odd I almost think there's a reporting error in the stats, but from what I saw today, the long-tail traffic is still going to sites (more today than on any Saturday I've had the site, by about 20%), and last week could have simply been an odd week. I'm sure Google wants more traffic to stay on their site, because that's business, but IDK how long they could keep it that way. One of the reasons people go to the web, AFAIK, is to visit websites, not a single website, so I can see where ads and maps and 'stuff' are important for them to display, but I think people who use Google will continue to visit websites (that's why they call it surfing the Internet), and I think it'll be interesting to see where things go.
What I'm saying about the changes taking time is fairly easy to understand if you think about it: it takes time to gather the data some systems need before they can be refined, and there seem to be a couple of systems (or more) that simply need more time to gather data before we see the 'final product'.
I guess another way of saying it is we could be seeing the final product 'in the rough', and as data is gathered and processed by the systems, the roughness will probably wear off.
If that's the direction they've gone and what they're doing, the system will continue to refine itself constantly, which is a huge move if you think about it. IDK if that's what they're doing or not, but it's something tedster suggested, and he's made some fairly good 'guesses' in the past if I remember correctly, so IMO it definitely could be.
Personalization and AI-type rankings both work in about the same way, although they're separate systems, so it's easy to think of them as 'click-based rankings', only with a really complex scoring system...
Simplistically, if you start off with ten URLs on a web page and you don't know which is the most popular or which is the best answer, you might display them in some type of random order until people start to click on them and 'vote' for each. After you have some data you could begin displaying the ones with the most clicks at the top once a certain 'threshold' is hit. (Say 20 clicks 'gets a site into the scoring' at the top of the page.) The ones with fewer than 20 clicks could remain randomly displayed below...
Sites within 5 clicks of each other could even switch back and forth until there was a 'more definite better answer'...
The above could all be written into the system from the start, so it adjusts by itself based on the clicks. The thing is, you need a certain amount of data before people can see the refinements the system will make to the scoring over time. Until you have the information you need about the specific sites (pages) for the specific system you are running, from the outside it looks random, broken, and not well refined; but once the scoring mechanisms have the data they need to 'kick in', the system looks 'fixed' and 'refined' and continues to look more 'refined' over time.
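If it helps, here's a quick sketch in Python of the kind of thing I mean. It's purely hypothetical, not anything Google has confirmed: the 20-click threshold and the 5-click 'tie' band are just the made-up numbers from above, and the URLs are placeholders.

```python
import random

# Hypothetical sketch of the 'click-based rankings' idea described above.
# The numbers are the example values from the post, nothing more.

CLICK_THRESHOLD = 20   # clicks needed before a URL is 'in the scoring'
TIE_BAND = 5           # URLs within this many clicks of each other may swap


def rank_urls(click_counts):
    """Order URLs the way the post describes.

    click_counts: dict mapping URL -> observed click count.
    """
    scored = [(url, c) for url, c in click_counts.items() if c >= CLICK_THRESHOLD]
    unscored = [url for url, c in click_counts.items() if c < CLICK_THRESHOLD]

    # URLs past the threshold are sorted by clicks, highest first.
    scored.sort(key=lambda pair: pair[1], reverse=True)

    # URLs within TIE_BAND clicks of the top of their group are treated as a
    # 'tie' and shuffled, so they can switch back and forth until one pulls
    # ahead as the 'more definite better answer'.
    ordered = []
    i = 0
    while i < len(scored):
        j = i
        while j + 1 < len(scored) and scored[i][1] - scored[j + 1][1] <= TIE_BAND:
            j += 1
        group = scored[i:j + 1]
        random.shuffle(group)
        ordered.extend(url for url, _ in group)
        i = j + 1

    # URLs below the threshold stay randomly ordered at the bottom.
    random.shuffle(unscored)
    return ordered + unscored


# Example: only two URLs have crossed the 20-click threshold so far,
# and they're within 5 clicks of each other, so they may trade places.
clicks = {"a.example": 31, "b.example": 28, "c.example": 12, "d.example": 3}
print(rank_urls(clicks))
```

Run it a few times and you'll see the two scored URLs swapping while the under-threshold ones shuffle below them, which is basically the 'looks random until the data kicks in' behavior I'm describing.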
Hope that makes a bit of sense. :)