Here's my current idea. I believe Google's staff includes more statisticians than any other specialty. The algo is, more and more, driven by statistics and probability. These statisticians watch query data as well as backlink data. That's what jumped out at me while re-reading this patent: backlinks PLUS queries.
Google's statisticians know which queries currently show bursts of interest from the general public. They know which companies are getting navigational queries - and they know when an online business is truly growing in brand recognition. For example, queries like [company keyword] will start increasing if there is real growing interest. We puzzle over "Update Vince [webmasterworld.com]"? How about defining "brand" by folding data on navigational queries into the ranking algo?
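To make the idea concrete, here's a toy sketch of what "detecting a burst of navigational-query interest" might look like. This is purely my own illustration - the function name, window sizes, and threshold are invented for the example, not anything taken from the patent:

```python
from statistics import mean

def query_burst(daily_counts, baseline_days=28, recent_days=7, threshold=2.0):
    """Flag a burst when the recent average of daily navigational-query
    counts runs well above the preceding baseline average.
    All parameters are illustrative guesses, not Google's actual values."""
    if len(daily_counts) < baseline_days + recent_days:
        return False  # not enough history to judge
    # Baseline: the window just before the recent period
    baseline = mean(daily_counts[-(baseline_days + recent_days):-recent_days])
    recent = mean(daily_counts[-recent_days:])
    if baseline == 0:
        return recent > 0  # any interest at all is new
    return recent / baseline >= threshold

# A flat month of [company keyword] queries: no burst.
# A week at triple the baseline: burst.
print(query_burst([100] * 35))              # steady interest
print(query_burst([100] * 28 + [300] * 7))  # genuine spike
```

The real system would obviously be far more sophisticated, but even a crude ratio like this separates "steady trickle" from "the public suddenly cares about this brand."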
When backlink numbers start growing, that new "interest" at the webmaster level should be supported by the general population's query data. In other words, if the backlink growth is relatively "natural", it should show a certain statistical footprint.
If the spike in backlink growth is too far outside that statistical footprint, then Google will take steps to limit the effect of that apparent SERP manipulation.
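Here's a back-of-the-envelope sketch of that consistency check - backlink growth measured against query growth. Again, every name and threshold here is my own invention for illustration; I'm only trying to show the shape of the comparison, not claim this is how Google implements it:

```python
def backlinks_out_of_footprint(backlink_growth_pct, query_growth_pct,
                               max_ratio=3.0, min_backlink_growth_pct=20.0):
    """True when backlink growth (say, week over week, in percent) far
    outruns growth in navigational/topical queries for the same site.
    Thresholds are illustrative guesses only."""
    if backlink_growth_pct < min_backlink_growth_pct:
        return False  # modest link growth isn't suspicious either way
    if query_growth_pct <= 0:
        return True   # links piling up while public interest stays flat
    return backlink_growth_pct / query_growth_pct > max_ratio

# Links and queries growing together: looks natural.
print(backlinks_out_of_footprint(50.0, 40.0))
# Links exploding while queries barely move: outside the footprint.
print(backlinks_out_of_footprint(200.0, 5.0))
```

If a check like this fires, the patent suggests the response isn't necessarily a penalty - just a dampening of how much those new links count.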
The statistically normal expectations are, by this time, quite granular and gaining in sophistication. The patent I mentioned in the opening post lists many possible measures that Google can take to determine when patterns fall outside the natural range. And they're probably using many others we haven't even guessed at.
This is my current brainstorming area, and it's why I recommend ATTRACTING backlinks rather than "building" them. Backlinks alone cannot create a statistically correct footprint for a growing, thriving website. Even though such a "dummied-up" impression has been a working tool for improved rankings in the past, it's a tool whose future is getting cloudier and cloudier.