tedster - 8:32 pm on Sep 11, 2012 (gmt 0)
1. Whatever Google is doing here, I really doubt it is intentionally aimed at undermining business for specific websites. Instead it would be a secondary phenomenon - a side effect of something else. Discovering what that "something else" is could provide ideas as to how to avoid it.
2. I know it can take a good bit of work, but raw server logs plus grep can really help zero in on suspect periods. There are also paid raw log analyzers that can make this approach easier. I've built traffic graphs on 10 minute intervals to isolate odd patterns on very busy sites. I've tracked down individual customers with this approach, as well as specific hacking attempts, individual scrapers and bots, and so on. If someone's livelihood is slipping away, it seems reasonable to me to dig this deep into the details.
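For anyone who'd rather script it than buy an analyzer, here is a minimal sketch of the 10-minute bucketing idea. It assumes the common Apache/nginx combined log format; the `access.log` filename and the regex are illustrative, not anything specific to your setup.

```python
# Bucket raw access-log hits into 10-minute intervals so odd
# traffic patterns stand out. Assumes Apache combined log format.
import re
from collections import Counter
from datetime import datetime

# Matches the timestamp field, e.g. [11/Sep/2012:20:32:01 +0000]
TS_RE = re.compile(r'\[(\d{2}/\w{3}/\d{4}:\d{2}:\d{2}):\d{2} ')

def ten_minute_buckets(lines):
    """Count hits per 10-minute interval from raw log lines."""
    buckets = Counter()
    for line in lines:
        m = TS_RE.search(line)
        if not m:
            continue
        ts = datetime.strptime(m.group(1), '%d/%b/%Y:%H:%M')
        # Round the minute down to the nearest 10
        key = ts.replace(minute=ts.minute - ts.minute % 10)
        buckets[key] += 1
    return buckets

if __name__ == '__main__':
    with open('access.log') as f:
        for interval, hits in sorted(ten_minute_buckets(f).items()):
            print(interval.strftime('%d/%b %H:%M'), hits)
```

You can pipe the output into a spreadsheet or gnuplot to get the kind of traffic graph I'm describing, or pre-filter the file with grep (by user-agent, IP, or URL) before bucketing to isolate one visitor or bot.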
3. If you think page load times are periodically affecting user experience, try boomerang.js to get that analysis of your actual traffic - it is free.
4. The loss of long tail traffic for specific periods of time can be confirmed or ruled out through keywords in the referrer string.
5. How Google is seeing user intent should also be noticeable through the keyword referrers.
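Points 4 and 5 can be checked with a few lines of script. Here's a minimal sketch of pulling the search keywords out of Google referrer strings - the function name and example URLs are illustrative, and note that referrers from logged-in Google users increasingly arrive without keywords ("not provided"), so expect some gaps.

```python
# Extract the search keywords from a Google referrer URL, if present.
from urllib.parse import urlparse, parse_qs

def search_keywords(referrer):
    """Return the 'q' parameter from a Google referrer, or None."""
    parsed = urlparse(referrer)
    if 'google.' not in parsed.netloc:
        return None
    # parse_qs decodes '+' and %-escapes for us
    return parse_qs(parsed.query).get('q', [None])[0]
```

Run every referrer from a suspect period through this and compare the keyword mix against a normal period - a shrinking share of multi-word queries is exactly the long-tail loss described in point 4.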
6. With regard to zombie traffic, a lot here could depend on volume. If traffic and conversions are generally low even when they are good, then variations might well be statistically expected and not "caused."
7. Checking what the results page looks like when traffic turns zombie or vanishes might provide a clue. Does a News, Video or Image listing appear and disappear, or move up and down? How about Google Maps or Products?
It is also entirely possible that individual reports here (either traffic shaping or zombie traffic) are not seeing the same thing at all. Analysis could turn up different underlying causes for different sites.