tedster - 7:36 pm on Oct 15, 2012 (gmt 0)
The so-called "minus 950" may not be a subtraction at all - not like a minus 30, for instance. Rather, it may be a multiplier of less than 1 applied to the original rank calculation. As such, it is merely a re-ranking mechanism that can be applied in many situations, rather than being just one "penalty".
This kind of re-ranking is a method that Google has mentioned in patents going back to at least 2003. It takes a preliminary set of results and re-ranks just that subset (1,000 or fewer) of all SERP candidates, either by adding/subtracting a set value or by applying a multiplier (see New Google Patent - About reranking results [webmasterworld.com]).
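To make the additive-versus-multiplier distinction concrete, here's a toy sketch of the idea. Everything in it - the function, the scores, the URLs - is invented for illustration; this is just the arithmetic the patents describe, not anything from Google's actual code.

```python
def rerank(candidates, adjustments):
    """Re-rank a preliminary subset of results at the last minute.

    candidates:  dict of {url: preliminary_score} from the full scoring pass.
    adjustments: dict of {url: (offset, multiplier)} - a penalty can be a
                 subtracted value (negative offset) or a multiplier below 1.
    """
    final = {}
    for url, score in candidates.items():
        offset, multiplier = adjustments.get(url, (0.0, 1.0))
        final[url] = score * multiplier + offset
    # Highest adjusted score first, just like a SERP.
    return sorted(final.items(), key=lambda kv: kv[1], reverse=True)

prelim = {"site-a.example": 9.2, "site-b.example": 8.7, "site-c.example": 4.1}

# A multiplier well below 1 on a single URL drops it toward the bottom of
# the set without touching anyone else's preliminary score.
print(rerank(prelim, {"site-a.example": (0.0, 0.05)}))
```

Note that the original scores are never destroyed - remove the adjustment entry and the URL snaps right back to its preliminary position, which matters for the point later in this post.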
In the early days of the -950 phenomenon, even though many saw their URLs drop to the very end of the results, others noticed that their URLs dropped a lot less than that, though the drop was still enough, in effect, to remove all traffic.
This is a very useful method for Google and I assume it's matured somewhat. Here's how I see this mechanism. Many common queries already have an established set of candidates for the final SERP. This is done to speed the results to the final SERP, since computing everything "on the fly" would take a lot of overhead. I'd bet that any query that appears in Google Suggestions, for instance, already has prefigured ranking candidates and their preliminary scores all neatly stored away somewhere.
Now those candidates get re-ranked for the final SERP, and this is where the subtraction/addition/multiplication of the preliminary ranking happens... at the last minute. By doing things this way, the original computational work of combining all 200-plus factors is not lost or obscured, it just gets modified at the last minute.
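The two-stage setup I'm describing could look something like this. Again, this is pure speculation expressed as a sketch - the query, sites, and scores are made up; the point is only that the expensive scoring happens once, ahead of time, and the cheap adjustment happens per request.

```python
# Stage 1 (expensive, done ahead of time): the full 200-plus-factor scoring
# pass produces preliminary candidates per query, stored for reuse.
precomputed = {
    "blue widgets": {
        "site-a.example": 9.2,
        "site-b.example": 8.7,
        "site-c.example": 4.1,
    },
}

# Stage 2 (cheap, at serve time): apply whatever re-ranking multipliers are
# currently in effect and emit the final ordering.
def serve(query, multipliers):
    candidates = precomputed[query]
    scored = {url: s * multipliers.get(url, 1.0) for url, s in candidates.items()}
    return [url for url, _ in sorted(scored.items(), key=lambda kv: -kv[1])]

print(serve("blue widgets", {"site-a.example": 0.05}))
```

Because stage 2 is just a handful of multiplications and a sort over at most ~1,000 items, it costs almost nothing per query - which is exactly why doing it "at the last minute" makes sense.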
Now when one of these factors gets changed, it can be removed altogether or it can remain in place but be made more or less severe. This clearly can be done by an algorithm or a manual action.
So what we have, IMO, is not a "new -950 penalty" per se but an old re-ranking method being applied in many contexts, some of them rather new. The re-ranking factor can be removed all at once or modified gradually. So it's certainly not necessary for a URL to climb back up the ranking positions one step at a time.
There may even be multiple re-ranking factors in place for a single URL. Adding and multiplying are very fast computationally, so folding in a bunch of last-minute re-ranking would not be a big deal. And for very high volume queries, even re-ranked results could be cached and just recalculated on a schedule (hourly or whatever) to save even more computational overhead.
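That scheduled-recalculation idea is basically a time-to-live cache. A minimal sketch, with the interval and structure entirely my own assumption:

```python
import time

class RerankCache:
    """Cache the fully re-ranked list per query; recompute only after the
    TTL expires, rather than on every request. Purely illustrative."""

    def __init__(self, recompute, ttl_seconds=3600):
        self.recompute = recompute   # function: query -> ranked list of URLs
        self.ttl = ttl_seconds       # e.g. hourly, as speculated above
        self.store = {}              # query -> (timestamp, ranked list)

    def get(self, query):
        now = time.time()
        entry = self.store.get(query)
        if entry is None or now - entry[0] > self.ttl:
            # Stale or missing: run the (relatively cheap) re-rank once,
            # then serve the cached copy until the TTL runs out.
            entry = (now, self.recompute(query))
            self.store[query] = entry
        return entry[1]
```

For a query hit thousands of times an hour, the re-rank then runs once per hour instead of once per request.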
My main point is this. No matter how low you are currently ranking, it is not a cause for hopelessness - especially if you used to rank well. The fact that your URL is still in the result set anywhere at all is a positive sign in a very negative situation.