You'd think traffic would be MUCH more random and naturally distributed.
I'm sorry, but this type of statement really irks me.
First, randomness is dichotomous: a process is either random or it is not. Second, what is a "natural" distribution? What are the properties of a natural distribution? I think the essence of what you are trying to say is that, simply by looking, you can tell that it is not random. To which I say: it is extremely difficult to determine whether a process is random just by looking at it. I strongly urge you to read this article on the Gambler's Fallacy [
en.wikipedia.org...], which explains why.
The other point I take from your statement, "much more random", is that you expect wider swings in traffic, that is, more variability. I find this an odd statement in the context of this thread. On the occasions when one does see periods of increased variability, almost every time one reads posts like "Google must be testing", "something must be wrong with the algo", or "Google has lost control".
The way I see it, your website gets a "global" rank: a weighted average of its per-term ranks, weighted by search volume and summed across all the search terms it ranks for. This is purely theoretical for us webmasters, as there is no reliable way to measure it accurately, and it is obviously changing continuously. But despite that change, it remains relatively stable over time, except in the case of algo updates, at which point there may be a sudden and significant change.
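To make the weighted-average idea concrete, here is a minimal sketch with entirely made-up numbers (the terms, rank positions, and search volumes are hypothetical, since, as noted, the real values cannot be measured):

```python
# Hypothetical per-term rank positions and monthly search volumes for one site.
# All figures are invented for illustration only.
rankings = {
    "blue widgets": (3, 10000),        # (rank position, search volume)
    "buy widgets online": (7, 4000),
    "widget repair guide": (1, 1500),
}

# "Global" rank: average of the per-term ranks, weighted by search volume.
total_volume = sum(vol for _, vol in rankings.values())
global_rank = sum(rank * vol for rank, vol in rankings.values()) / total_volume

print(round(global_rank, 2))  # high-volume terms dominate the average
```

The point of the weighting is that a #3 position on a high-volume term moves the global figure far more than a #1 position on a niche term.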
What this means in terms of traffic is that, while your rank remains stable and largely predictable (i.e. not random), there is a baseline of traffic. Then, on top of this stable traffic, there is a random component: new users searching for the first time, old users who stop searching, the occurrence of first-time search terms, and so on. This variability is also somewhat predictable, in that it stays within certain bounds. Say you have a baseline of 800 visits, and the variable component ranges from 0 to 200. Your traffic will then range between 800 and 1000 on any given day, but the exact figure for any given day will be random within that range. The degree of variability is a function of the specific website and its niche, and it will likely change with algo updates. You can call this throttling if you like.
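The 800-plus-0-to-200 model above can be simulated in a few lines. This is only a sketch: it assumes, purely for simplicity, that the random component is uniformly distributed, which the original argument does not require.

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

BASELINE = 800   # stable, rank-driven daily traffic
MAX_EXTRA = 200  # upper bound of the random component

# Thirty days of simulated traffic: each day is the baseline plus a
# random amount between 0 and MAX_EXTRA inclusive.
daily = [BASELINE + random.randint(0, MAX_EXTRA) for _ in range(30)]

# Day-to-day figures vary, but every day lands inside the 800-1000 band.
print(min(daily), max(daily))
```

Run it a few times with different seeds and the individual days jump around unpredictably, yet the range never leaves the band, which is exactly the "random within bounds" pattern described above.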
So it would be perfectly normal, if not expected, to see similar traffic patterns year over year for a specific website.