I think the loss of long tail has more to do with Google's incredible improvements in AdWords relevance matching. - internetheaven
Long tail traffic in most cases relies on internal link juice; that's how it works. - SEOPTI
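To picture what "internal link juice" means in practice, here's a quick PageRank toy over a made-up internal link structure (the site layout and all the numbers are invented, purely for illustration). Deep article pages get almost all of their score from the links the category pages send them:

# Minimal PageRank over a hypothetical internal link graph, to
# illustrate how deep pages depend on internal links for "juice".
# The site structure below is made up for this example.

links = {
    "home":       ["category-a", "category-b"],
    "category-a": ["article-1", "article-2", "home"],
    "category-b": ["article-3", "home"],
    "article-1":  ["home"],
    "article-2":  ["home"],
    "article-3":  ["home"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            # each page splits its rank evenly across its outlinks
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(f"{page:12s} {score:.3f}")

Delete one of the category-to-article links and rerun it; the orphaned article's score collapses, which is exactly the long-tail effect SEOPTI is describing.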
I believe the reason the clean, well-SEO'd sites are feeling the effects first is that these are the easiest ones for Google to digest and update (onto Caffeine), but doing these first results in reduced positions until the final switch.
Everyone...take a look at your Google Webmaster Tools spider crawl rate setting. You can view it by changing it to a custom rate. Do you see it higher? Lower? Just curious, since Google gets to pick if you have it set to "Let Google determine my crawl rate (recommended)".

Interesting - I never noticed this feature because it had never worked (it was disabled for my sites, anyway) when I opened the page. It does now. My server can easily handle something like 100 times what they recommend, and will probably withstand some peaks way above even that just fine. So, in the spirit of testing, I just cranked that setting up +600% - we'll see what happens :)
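If you want a reality check on what Googlebot is actually doing versus the GWT setting, a rough count from your access logs does the job. This is only a sketch, assuming an Apache/Nginx combined log format; the log path is hypothetical, and the user-agent match is spoofable (a reverse DNS lookup to *.googlebot.com is the proper verification):

# Rough Googlebot crawl-rate check from an access log, so you can
# compare what your server actually sees against the GWT setting.
# Assumes Apache/Nginx combined log format; adjust LOG_PATH.
import re
from collections import Counter

LOG_PATH = "/var/log/apache2/access.log"  # hypothetical path
# combined format starts with: IP - - [dd/Mon/yyyy:HH:MM:SS +0000] ...
timestamp = re.compile(r"\[(\d{2}/\w{3}/\d{4}:\d{2})")

hits_per_hour = Counter()
with open(LOG_PATH) as log:
    for line in log:
        if "Googlebot" not in line:   # UA check; spoofable, but fine
            continue                  # for a rough estimate
        match = timestamp.search(line)
        if match:
            hits_per_hour[match.group(1)] += 1

for hour, hits in sorted(hits_per_hour.items()):
    print(f"{hour}  {hits} requests  (~{hits / 3600:.2f} req/sec)")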
Are the sites that are rising any different in terms of content from those that are dropping, and is that content more "sticky"?
One of the interesting factors people are reporting - if I understand correctly - is that we are not seeing just a one-time drop in ranking. Instead, it's a process of dropping a bit, then a bit more, then a bit more - and sometimes the pattern breaks with a return to good rankings for a period.
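If you log your positions daily, that staircase pattern is easy to spot programmatically. A small sketch - the position history below is invented, so feed in your own tracked rankings:

# Quick check for the "staircase" pattern described above: repeated
# small drops with occasional recoveries. Position 1 = top of SERP.

def describe_moves(positions):
    """Label each day-over-day change in a list of SERP positions."""
    moves = []
    for prev, curr in zip(positions, positions[1:]):
        if curr > prev:
            moves.append(f"dropped {prev} -> {curr}")
        elif curr < prev:
            moves.append(f"recovered {prev} -> {curr}")
        else:
            moves.append(f"held at {curr}")
    return moves

history = [3, 5, 5, 8, 12, 4, 4, 9, 15]  # hypothetical daily positions
for day, move in enumerate(describe_moves(history), start=1):
    print(f"day {day}: {move}")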
Knock sites up and down, measure user behaviour in different positions for different searches, and gradually a more settled pattern will emerge. This can be achieved with small jumps up and down - like continual A/B testing - although it could take weeks; that's anyone's guess.
It also ties in with an infrastructure change, which would likely be required to collect and store the data, automate the movement, and measure the effects of each comparison.
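For what it's worth, here's a sketch of the kind of bookkeeping that continual position testing would need: store each (query, URL, position) observation with impressions and clicks, then compare click-through by position. The schema and all the numbers are invented for illustration:

# Sketch of the bookkeeping behind A/B-style position testing:
# log each observation, then compare click-through rate by position.
import sqlite3

conn = sqlite3.connect(":memory:")  # use a file path for persistence
conn.execute("""
    CREATE TABLE serp_tests (
        query       TEXT,
        url         TEXT,
        position    INTEGER,
        impressions INTEGER,
        clicks      INTEGER
    )
""")
conn.executemany(
    "INSERT INTO serp_tests VALUES (?, ?, ?, ?, ?)",
    [
        ("blue widgets", "example.com/widgets", 3, 1000, 90),
        ("blue widgets", "example.com/widgets", 7, 1000, 25),
        ("blue widgets", "example.com/widgets", 3, 1200, 110),
    ],
)

# CTR per position: did the page earn its higher slot?
for position, ctr in conn.execute("""
        SELECT position,
               1.0 * SUM(clicks) / SUM(impressions) AS ctr
        FROM serp_tests
        GROUP BY position
        ORDER BY position
    """):
    print(f"position {position}: CTR {ctr:.1%}")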
Here's the Geek Question: if you were a Google engineer and suddenly had all this extra data storage capacity, what would be at the top of your list to implement into the algo - stuff you had wanted to do previously but couldn't because of the data limits?
A few of my older sites (2+ years) seem to have settled down, in that they show the same spot on Google.com all the time, and also when I check on that one Caffeine datacenter IP. For some of my newer sites the jumping around continues all the time. I guess that's a slight good sign that hopefully things will begin to settle down once and for all.
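For anyone else checking a single datacenter: the usual trick was to send the same query straight to the datacenter's IP with a Host header and compare it against regular Google.com. The IP below is a placeholder from the documentation range, not a real datacenter, and Google may block or redirect automated queries, so treat this strictly as a sketch:

# Compare a specific datacenter against google.com by sending the
# same query to the datacenter's IP with a Host header.
import requests

DATACENTER_IP = "203.0.113.5"     # placeholder, NOT a real datacenter
QUERY = "example long tail phrase"
MY_URL = "example.com/some-page"  # the page whose ranking you track

default = requests.get(
    "https://www.google.com/search",
    params={"q": QUERY},
    headers={"User-Agent": "Mozilla/5.0"},
    timeout=10,
)
# Same query, sent straight to the datacenter IP; the Host header
# tells that server which site we mean to reach.
datacenter = requests.get(
    f"http://{DATACENTER_IP}/search",
    params={"q": QUERY},
    headers={"Host": "www.google.com", "User-Agent": "Mozilla/5.0"},
    timeout=10,
)
for name, resp in (("google.com", default), ("datacenter", datacenter)):
    print(f"{name}: HTTP {resp.status_code}, "
          f"page listed: {MY_URL in resp.text}")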