Shaddows - 9:51 am on Jun 7, 2012 (gmt 0)
diberry, that is very similar to what I saw when I first started to believe [webmasterworld.com]* in a Traffic Shaping mechanism (although our daily variation is significantly more) - except I couldn't see any significant ranking change.
Obviously the problem we all have is data access. I am not about to give anyone access to my logs or analytics, and I really do not like giving out site specifics. Especially as I don't have a Traffic Shaping "problem" - my normal traffic is perfectly fine. However...
It occurs to me that Zombie Traffic is not being seen by webmasters with high levels of traffic, with few exceptions. We get up to 7500 uniques a day, 5000 on weekends. Average is around 6200. I suspect that is on the high side for Zombie Believers.
Now, a significant proportion of my Zombies are on tertiary product ranges. In aggregate, these get "normal" traffic in the low hundreds, spiking to around 1000 during heavy Zombie attacks. We have thousands of products in that "tertiary" grouping, so there is plenty of room to wander, but they limit themselves to a small, frequently changing subset of a couple of hundred products.
Whenever we get extended Zombie periods, a referral shift usually follows within days of the Zombies disappearing again. The referral shift is usually of our primary (60% of sales) or secondary (30% of sales) products - not where the Zombies were roaming.
The important take-away is this: Zombie traffic results in changed traffic - that is either a reprofiling (of us, of searchers, of query intent, or all three) or an algo change.
Anyway, my theory is that the reason we can't pin down Zombies comes down to the targets they visit. Here are some figures.
1000 uniques a day works out to fewer than 2 uniques per 10 product pages, and 0.5 uniques per 10 indexed pages. Spread across that many pages, it is not statistically significant at the page level. Big sites (who are more likely to number-crunch) that have relatively evenly distributed traffic are just not going to see this.
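To make the dilution concrete, here is a rough back-of-envelope sketch. The page counts are my own assumptions (back-solved from the "2 per 10 product pages" and "0.5 per 10 indexed pages" ratios quoted above), not figures from the site itself:

```python
# Hypothetical numbers illustrating why peak Zombie traffic vanishes
# into per-page noise. Page counts are assumed, implied by the ratios
# quoted in the post, not actual site figures.

zombie_uniques_per_day = 1000   # peak Zombie traffic during an "attack"
product_pages = 5000            # assumed: "thousands of products"
indexed_pages = 20000           # assumed: implied by ~0.5 uniques / 10 pages

per_10_product_pages = zombie_uniques_per_day / product_pages * 10
per_10_indexed_pages = zombie_uniques_per_day / indexed_pages * 10

print(f"{per_10_product_pages:.1f} uniques per 10 product pages")  # 2.0
print(f"{per_10_indexed_pages:.2f} uniques per 10 indexed pages")  # 0.50
```

At a fraction of a visit per page per day, no per-page analytics view would flag the pattern - you would only spot it in the aggregate, and only if the aggregate is a big share of your total.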
I'm not going to go further into my site metrics, as I have posted rather a lot here. Suffice to say my primary product rankings are robust, in high volume niches. My secondaries are robust but less competitive. My tertiaries are long-tail (I prefer mid-tail). The fact all the chaos is in the long tail means my head terms are unaffected - again, I suspect this is rather unusual.
My working theory is twofold, with an untestable assumption thrown in. The assumption is that Google has layers of profiles.
First part: Zombies are an intensive "scanning" method - using a weighted statistical traffic profile to sling mud at a site and see what sticks. It is used for LOW VOLUME sites... or sections of sites where that site has been differentially profiled by niche. Read that sentence again, it's important.
Second part: The data gathered is fed back into multiple metrics. Some are page-level. Some are page-group level. Some are site level - even though the site was only partially sampled.
I really don't want to share more until there's some quid pro quo from others with similarly qualitative data.
1) Google needs to use a certain amount of traffic to get statistically significant results. This minimum level is a higher proportion of a small site's overall traffic, hence more noticeable.
2) If Google calculates a % of site traffic to use for its Zombie scan, that will be more obvious to those who have a steep distribution curve - and invisible to those with a shallow one.
3) As well as scanning sites, Google could be categorising query intents by showing them multiple sites. Thus, some "Zombies" might actually interact.
4) To anticipate the "netmeg repudiation" that G doesn't "send" anything... For my theory to work, a small, predefined sample of searchers would have to be shown different SERPs. Highly focussed A/B testing, if you will. Then the rest of the world would see rankings unchanged.
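Point 1 is easy to sketch numerically. The minimum sample figure below is purely an assumption for illustration (nobody outside Google would know the real threshold, if one exists), but it shows why the same scan would be glaring on a small site and invisible on a big one:

```python
# Illustrative sketch of point 1 - NOT Google's actual mechanism.
# Assumption: a scan needs some fixed minimum number of visits to
# yield a statistically useful read on a site.

MIN_SAMPLE = 400  # assumed threshold, for illustration only

# Small site, this poster's average, and a big number-cruncher's site
for daily_uniques in (500, 6200, 100000):
    share = MIN_SAMPLE / daily_uniques
    print(f"{daily_uniques:>6} uniques/day -> scan is {share:.0%} of traffic")
```

On the 500-uniques site the scan is most of the day's traffic and impossible to miss; at 100,000 uniques it rounds to 0% and disappears into normal variation - which matches the observation that Zombie Believers skew towards lower-traffic sites.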
*Please, please read that thread. It was written pre-Panda, which means a significantly different set of members were contributing.