We saw throttling YEARS ago, and believe the modern version to be strongly related to [mis-matched / buy-adverse / non-engaged]* traffic. It has come in two very different flavours over the years. A warning before I continue: as has always been the problem here, lack of syntactic / analytic discipline usually derails this type of discussion. For example, people do not limit the discussion to Google organic traffic only. Things then become like the Trump / not-Trump discussion (or leave/remain, if you prefer), in that each side feels perfectly happy in the knowledge that the other side are simple cretins who just don't get it, and can therefore be ignored and/or insulted with impunity.
As a worked example:
WidgetMan sees throttling, on undifferentiated traffic.
ThrottleSceptic says "That's not just Google Organic so your views are invalid".
WidgetMan may not know and certainly does not explain that 95% of his traffic is from Google and the remaining 5% always stays stable.
Even if he did explain, ThrottleSceptic takes the stable 5% as evidence that ALL traffic is throttled so is probably a server/hosting problem.
WidgetMan and ThrottleSceptic then ignore each other, except when baiting or insulting each other.

1) The buzz-cut
I have not really seen this since the Caffeine infrastructure was launched. It's the originally observed throttling, whereby you cannot get more than X visitors in a time period.
Mechanism could be simple or complex, but per Occam, let's go with simple: a referral budget set by some over-arching, non-keyword score. When the site reaches its limit, it gets dropped a few places and loses most of its click-thrus.

2) The fuzzy line
This one is really, really hard to convince other people of. Basically, your referrals always sit in a tight band: an early spike means a late dip. I originally saw this in weekly traffic numbers. A really good week would be killed by a statistically improbable Friday, leaving a +/- 3% figure. The fuzzy line has another significant feature worth noting: if your traffic is down, you get a boost. For example, I would get a "normalised" bank-holiday Monday, where traffic was close to a regular Monday rather than the weekend norm seen in our non-G traffic. We would then have a strong Tuesday-Friday, meaning the week view stayed within normal variance. **[But see edit below]
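For the avoidance of doubt, this is just my reading of my own numbers, but the two symptoms described above (weekly totals in a tight band, early spikes offset by late dips) are easy to check for in anyone's daily visit counts. A minimal sketch, with all thresholds and the early/late split invented purely for illustration:

```python
# Illustrative checks for the two "fuzzy line" symptoms on daily visit
# counts. The Mon-Wed vs Thu-Sun split and any thresholds are assumptions
# for the sketch, not observed values.

def weekly_totals(daily):
    """Sum daily counts into weeks. Assumes len(daily) is a multiple of 7."""
    return [sum(daily[i:i + 7]) for i in range(0, len(daily), 7)]

def band_width(weeks):
    """Half-width of the weekly band as a fraction of the mean.
    A tight band (e.g. within +/- 3%) is the first symptom."""
    mean = sum(weeks) / len(weeks)
    return max(abs(w - mean) for w in weeks) / mean

def spike_dip_correlation(daily):
    """Pearson correlation between early-week (first 3 days) and late-week
    (last 4 days) totals. Strongly negative matches "an early spike means
    a late dip". Assumes the series has some week-to-week variation."""
    early = [sum(daily[i:i + 3]) for i in range(0, len(daily), 7)]
    late = [sum(daily[i + 3:i + 7]) for i in range(0, len(daily), 7)]
    me, ml = sum(early) / len(early), sum(late) / len(late)
    cov = sum((e - me) * (l - ml) for e, l in zip(early, late))
    ve = sum((e - me) ** 2 for e in early)
    vl = sum((l - ml) ** 2 for l in late)
    return cov / (ve * vl) ** 0.5
```

On genuinely throttled data you would expect a near-zero band width and a strongly negative spike/dip correlation; on independent noise, neither.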
The mechanism here seems more complex. One method would be for a site or page approaching its limit to get dropped a rank or two: CTR drops but is not eliminated, so there is little in the way of a smoking gun. Getting a bit tin-foil given the processing required, but theoretically Google could calculate a "glide-path" and tweak your ranking a position up or down to keep you on it. That way, there would be almost no way of detecting throttling except by outcome.
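To make the glide-path idea concrete, here is a purely speculative toy model, not anything Google is known to do: nudge a page one position up or down each day so that cumulative clicks track a straight-line target. The CTR-per-position table and all parameters are invented for illustration.

```python
# Speculative toy model of the "glide-path" mechanism described above.
# The CTR-by-position numbers are made up; real figures vary by query.
CTR = {1: 0.30, 2: 0.18, 3: 0.11, 4: 0.07, 5: 0.05}

def simulate_glide_path(daily_budget, days, impressions=1000, start_rank=3):
    """Each day: serve at the current rank, then move one position toward
    the glide path (demote if ahead of target, promote if behind)."""
    rank, clicks, history = start_rank, 0.0, []
    for day in range(1, days + 1):
        clicks += impressions * CTR[rank]
        history.append(rank)
        target = daily_budget * day        # straight-line cumulative target
        if clicks > target and rank < 5:   # ahead of the path: drop a spot
            rank += 1
        elif clicks < target and rank > 1: # behind the path: gain a spot
            rank -= 1
    return clicks, history
```

The point of the sketch is the outcome: total clicks hug the budget while the rank wanders a position or two either way, which is exactly the "nothing looks wrong, except the totals" signature described above.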
Also, the fuzzy line, in sharp contrast to the buzz-cut, seems not to apply to the whole site. I don't know whether it is throttled by page, by keyword or by something else, but we found we could get peaks in different areas at different times.

Ok, but why...
It's been mentioned by someone else above, but might this be Google trying to be fair?
With 1,000,000+ results for any given search, a winner-takes-all approach where the sole #1 occupant takes all the traffic seems, well, stupid. Why not rotate the top spots to share the love? For a start, this makes it more likely the putative "winner" can cope with the traffic, which is far from a given if an update unexpectedly pushes a smallish site to #1.
If you are dispositionally opposed to the idea of Google being "fair", there is a data-collection angle. Rotating results allows Google to collect relative performance data: for example, CTR data and bounce-derivative behaviour (ClickBack-ReClick, ClickBack-Refine, ClickBack-NewSearch; not raw bounce) for multiple sites in a controlled environment. Presumably, the relative aggregate behaviour can be used to refine the SERP, refine user-intent, or even act as some sort of ranking factor for the rotated pages. I do A/B testing, and I have neither the Big Data analytical power nor the legions of doctorate-level boffins to interpret it for me. Google does.

*Avoiding the Z-word so as not to derail this thread
**ETA - I believe my data, but see for example "reversion to the mean" as to why spikes/troughs often disappear when "zoomed out"
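The sceptic's counterpoint in that edit is worth seeing in numbers. With purely independent daily noise and no throttling at all, weekly totals come out roughly sqrt(7) times smoother (relative to the mean) than daily ones, so spikes naturally flatten when you zoom out. A sketch on synthetic data:

```python
# Demonstration of why "zoomed out" data looks smoother even without any
# throttling: summing independent daily noise into weeks shrinks relative
# variation by about sqrt(7). The traffic numbers are synthetic.
import random

def coefficient_of_variation(xs):
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    return var ** 0.5 / mean

random.seed(42)
daily = [random.gauss(1000, 150) for _ in range(7 * 52)]   # a year of noisy days
weekly = [sum(daily[i:i + 7]) for i in range(0, len(daily), 7)]

daily_cv = coefficient_of_variation(daily)     # relative spread, day level
weekly_cv = coefficient_of_variation(weekly)   # relative spread, week level
```

So a tight weekly band on its own is weak evidence; it is the early-spike/late-dip compensation within the week that independent noise cannot explain.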
[edited by: Shaddows at 3:42 pm (utc) on Nov 30, 2016]