I do like to set the cat among the pigeons...
So...
Double, double toil and trouble;
Fire burn and caldron bubble.
Here goes...
One of webdev's dirty little secrets (among many) is that traffic is much less human and much more bot than most know or like to admit. The ad networks admit to 20%; a number of academic studies report over 50%.
Further, those studies consistently indicate that, as a percentage of site traffic, the smaller the site the greater the bot-to-human ratio, and the larger the site the lesser; graphed from small to large sites, the bot percentage drops from over 80% to under 20%.
Similarly, the smaller the site, typically the less bot-aware and the less mitigation-capable the webdev.
Note: inexpensive web hosts typically block only about half of bot traffic; the better ones still miss a quarter.
Note: the current four generations of bots:
1. simple scripts; unable to handle cookies or JavaScript.
2. headless browsers, e.g. PhantomJS; can handle cookies and execute JS.
3. full browsers; able to simulate human-like behaviours.
4. full browsers; advanced human-like behaviours, rotating user agents and IPs
---may be rotating through hijacked computers.
Note: very few methodologies notice this type.
With this as background, let's run a scenario: a small site, 100 uniques a day, wholly Google organic.
100 visitors, half are unrecognised bots;
50 visitors, 5% conversion rate;
2.5 conversions a day on average.
Of course, "on average" is not a constant but a statistical calculation over time.
In reality the unrecognised bot percentage fluctuates, the conversion quality of the human traffic fluctuates, etc. So while the monthly number of conversions may stay within a band, e.g. 2.5 * 30 = 75 plus or minus, the daily numbers are susceptible to significant variation.
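How wide that daily variation runs can be sketched with a simple binomial model. This is just an illustration under the scenario's own assumptions (50 human visitors a day, a flat 5% conversion rate, each visitor independent), not a claim about any real site:

```python
# Daily conversions modeled as Binomial(n=50, p=0.05):
# 50 human visitors, each converting independently at 5%.
from math import comb

n, p = 50, 0.05

def pmf(k: int) -> float:
    """Probability of exactly k conversions in a day."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

mean = n * p                                 # 2.5 conversions/day on average
p_zero = pmf(0)                              # chance of a blank day
p_band = sum(pmf(k) for k in (1, 2, 3, 4))   # chance of a "typical" 1-4 day

print(f"mean/day: {mean}")
print(f"P(0 conversions): {p_zero:.3f}")
print(f"P(1-4 conversions): {p_band:.3f}")
```

Even with the long-run average pinned at 2.5, individual days scatter well outside it, which is the point: the daily number is noise around a band, not a dial.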
With a tenfold larger site, 1000 uniques a day, the susceptibility to variation is less in your face:
1000 visitors, half unrecognised bots;
500 visitors, 5% conversion rate;
25 conversions a day on average.
If the first site drops half its conversions, 2-3 a day drops to 0-1, which is much more apparent than the second suffering similarly, 25 dropping to 12. Add in simple statistical variation and the first is far more likely to see periods of none than the second.
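That "periods of none" point can be checked directly. Under the same assumed flat 5% rate and independent visitors, the chance of a zero-conversion day is just 0.95 raised to the number of human visitors:

```python
# Chance of a day with zero conversions, assuming each human visitor
# independently converts at 5% (the scenario's assumptions, not real data).
small_zero = 0.95 ** 50    # 50 humans/day on the small site
large_zero = 0.95 ** 500   # 500 humans/day on the larger site

print(f"small site, P(zero-conversion day): {small_zero:.3f}")   # roughly 8%
print(f"large site, P(zero-conversion day): {large_zero:.1e}")   # vanishingly small
```

So the small site can expect a blank day every couple of weeks from pure chance alone, while the larger one effectively never sees one.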
Combine low traffic with the outsized impact of unrecognised bots on smaller sites and many reported anomalies may appear that, as with the Google sandbox of fifteen years ago, do not actually exist, being but artifacts of other occurrences.