I work with seasonal sites too, where traffic differs more than tenfold between the busy and quiet times of the year. I am sure Google is aware of this too.
Site/niche classification
When Google classifies sites, it knows the average behaviour of each category/niche with regard to traffic and links, and anything that deviates by more than x% looks unnatural. For some sites the busy/quiet pattern is natural, and an evenly spread link acquisition could be the thing flagged as unnatural.
Must be data driven, and it relies on human greed
The issue Google has is that, because of the huge volume of data, it has to rely on some kind of deviation from the link acquisition pattern within the site's classification to suspect unnatural links(*). And human greed (greed for links that help ranking) helps Google here.
Not new - this is how Retail Loss Prevention works too
I worked in Retail IT for years, and this is not dissimilar to the way Loss Prevention works. Take, for example, a supermarket. If a till operator has stolen money once (by, let's say, performing a bogus return/refund), it is difficult to notice, because the bogus refund is masked/hidden amongst all the other validly performed refunds.
But once the bogus refund has worked, after some time the till operator does it again (and at that point it is still not noticed). With time they get braver and braver, doing it again and again with shorter and shorter periods in between. At a certain point they have suddenly performed x times more refunds than the average of all the other operators, and there is very little room left to argue it is circumstantial.
This deviation from the average pattern is what gets highlighted by the Loss Prevention reporting system, and this is how they get caught.
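Purely as an illustration (I have no knowledge of how any real Loss Prevention product, or Google, implements this - the data, the function name and the threshold are all made up), the deviation-from-peer-average idea can be sketched with a simple z-score check:

```python
from statistics import mean, stdev

def flag_outliers(counts, threshold=1.5):
    """Return the names whose count sits more than `threshold` standard
    deviations ABOVE the peer average. The threshold is arbitrary here;
    a real system would tune it (and use far more than one metric)."""
    values = list(counts.values())
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []  # everyone behaves identically - nothing stands out
    return [name for name, c in counts.items() if (c - mu) / sigma > threshold]

# Refunds performed per till operator over the same period (invented numbers):
refunds = {"op1": 12, "op2": 9, "op3": 11, "op4": 10, "op5": 87}
print(flag_outliers(refunds))  # → ['op5']
```

Note that a single extreme operator inflates the standard deviation itself, which is why the threshold here is low; robust versions use the median and MAD instead. The same shape of check applies whether the counted events are refunds per operator or new links per site within a niche.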
The same principle applies to link acquisition
A webmaster artificially builds a very few links and sees a minuscule improvement. They wait for some time; the improvement is still there and no penalty arrived. Great, they say... a few more would not hurt, and perhaps they see further ranking improvements... then one more would not hurt... and at some point the site's link acquisition pattern is, beyond doubt, totally different from what similarly categorised sites have.
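The escalating pattern described above - bursts of link building arriving with shorter and shorter gaps in between - is itself a detectable signal, separate from the raw volume. A minimal sketch (the helper name and the timestamps are my own invention, not any known Google signal):

```python
def is_accelerating(timestamps, min_events=4):
    """True if the gaps between consecutive events shrink monotonically,
    i.e. each burst follows the previous one sooner than the last did."""
    if len(timestamps) < min_events:
        return False  # too little history to call it a trend
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return all(later < earlier for earlier, later in zip(gaps, gaps[1:]))

# Days on which new batches of links appeared for a site:
print(is_accelerating([0, 60, 100, 125, 140]))  # gaps 60, 40, 25, 15 → True
print(is_accelerating([0, 30, 60, 90, 120]))    # steady drip, gaps all 30 → False
```

A steady drip of links, however artificial, would pass this particular check - which is exactly why the greed-driven speeding up is what gives the game away.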
So there are "corridors" of what is acceptable/understandable/expected for a given site, based not just on Google's classification of the site but, I am sure, on other factors such as the site's exposure etc.
And then there are false positives (and false negatives)
It is clear that the above can produce false positives, because the average corridor, however wide, is still an "average": there will be sites that acquire links outside this corridor in a completely natural way, and there will be sites that are not quite correctly classified, so a different corridor should really apply to them.
Discouraging link building
Discouraging link building [webmasterworld.com...] [webmasterworld.com...] is one way. Webmasters who are involved in SEO will read/hear what John Mueller or Matt Cutts said, and these are the ones Google wants to discourage. The average Joe Bloggs has no idea who Matt or John are and will keep dropping links in their forum posts just as they did before... and these are the links Google does want to stay, because the algo is still based on links [webmasterworld.com...].
(*) I have ignored other factors that can also tell Google links may be unnatural, such as links from link farms etc., since this thread discusses velocity.