The older patents from Google actually mention two possible ways of assessing spam penalties: subtracting a set number of positions (example: -50) and multiplying the score by a percentage (example: the end-of-results penalty, often called -950). A penalty could be assessed for a certain period of time - and if you want to call that a "sandbox", you could - but Google called it a penalty.
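Just to make the distinction concrete, here's a toy sketch of the two penalty styles - this is purely illustrative, not Google's actual code, and all the numbers are invented:

```python
# Hypothetical illustration of the two penalty styles described in the
# older patents. Function names and default values are my own inventions.

def fixed_position_penalty(position, offset=50):
    """Demote a result by a set number of positions (a '-50' style penalty)."""
    return position + offset

def multiplicative_penalty(score, factor=0.05):
    """Scale a ranking score down by a percentage. A small enough factor
    effectively pushes the URL to the end of the results - the so-called
    '-950' / end-of-results penalty."""
    return score * factor
```

The key difference: a fixed offset demotes every affected URL by the same amount, while a multiplier hits high-scoring pages hardest in absolute terms, which is why the multiplicative version can look like an end-of-results penalty.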
At PubCon Vegas many years ago, I was at the courtyard bar of the Renaissance hotel with Matt Cutts and he was describing how Google was assessing spam penalties - mostly manual at the time - and how they planned to automate more and more of the process, including the automated lifting of penalties.
The fact is that Google actively looks for methods of artificially boosting rankings, and if they go beyond a certain level, then Google knocks that ranking back down. And THEY call it a penalty. And I'm darned sure there's a record kept for a domain any time this happens.
Now not everyone who runs into a ranking loss today knows that they are spamming. Some practices that people kept closely held a few years ago stop being effective once Google catches on. Then they leak into more public knowledge. People pick up that "advice" here and there, apply it without testing, and get penalized as spammers.
Other sites just don't rank well. They never ranked well, so you can't assume that their low ranking is a penalty.
the waves of the new Google.
Tedster, can you share some deeper insight into what you perceive this to be, in the context of this conversation?
If you check rankings within the Webmaster Tools report, you can see what positions your URLs get impressions for - and it's all over the place. The patterns can often appear wave-like. In addition, people who track lots of ranking data over time see wave-like patterns flowing through the SERPs as rankings churn. These are the kinds of waves that suggest the functions you see in some areas of formal mathematics - complex analysis, statistics, matrix methods and the like.
I don't have this nailed down, and I'd need to be near the top of the ranking team at Google to have enough information even to get close. But rankings today often seem to change in wave-like patterns. It's as though the algo is sampling the user results that various ranking formulas generate, and analyzing that user feedback in an automated fashion.
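To show the kind of feedback loop I'm speculating about, here's a minimal sketch: several candidate ranking formulas get sampled for live traffic, user feedback comes back as a signal (a click-through rate here), and traffic shifts toward whatever performs better. Everything in it - the names, the update rule, the numbers - is my own assumption, not anything Google has confirmed:

```python
import random

# Speculative sketch of sampling several ranking formulas and shifting
# traffic based on observed user feedback. All names and values invented.

def sample_formula(weights):
    """Pick a ranking formula in proportion to its current weight."""
    formulas = list(weights)
    return random.choices(formulas, [weights[f] for f in formulas])[0]

def update_weight(weights, formula, observed_ctr, rate=0.1):
    """Nudge a formula's weight toward the feedback it just generated."""
    weights[formula] += rate * (observed_ctr - weights[formula])
    return weights
```

If something like this is running, the churn makes sense: as formulas gain and lose weight, the same URL gets served under different rankings at different times - which would look exactly like waves from the outside.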
When these wave-like patterns were first noticed, some called it a yo-yo, some called it a sine wave, etc. But it's not as regular as a sine or a cycloid. Whatever is going on here, Google has become statistically very sophisticated. And that's what I refer to as "the waves of the new Google". Today's Google does not simply score URLs against a checklist of factors, assigning points for each one, with the pages that earn the most points winning.
We're on the outside trying to look into a black box we call the ranking algo, trying to figure what goes on in there. And the mental model of a checklist is still a workable approximation a lot of the time.
But it looks to me like that isn't really how things work anymore - not with real precision. The checklist is more like the way many search engines worked years ago. Today there is statistical sampling, waves of ranking changes, and at least some automated application of statistical feedback from user data.
And in the midst of these ranking waves, there are websites that trip some automated flag - and their previously strong rankings do still get penalized.