So the new spam will be keyword stuffing of links - that's fandabidozi.
Either a page is spam or it is not - spam does not depend on the search terms. Spam pages/sites should be removed from the index entirely, not dropped from the SERPs by an ad-hoc filter whose parameters vary with the search terms.
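Here is a minimal sketch of the distinction I mean (all names are my own, hypothetical ones): spam treated as a property of the page, decided once at index time, versus a query-time filter whose cut-off shifts with the search terms.

```python
def index_time_policy(pages, is_spam):
    """Spam is binary: decide once and drop the page from the index."""
    return [p for p in pages if not is_spam(p)]

def query_time_policy(pages, spam_score, query, threshold_for_query):
    """The same page may appear for one query and vanish for another,
    because the cut-off depends on the search terms."""
    return [p for p in pages if spam_score(p, query) < threshold_for_query(query)]
```

With the first policy the index is clean and every query sees the same set of pages; with the second, the spammer's page is still sitting in the index waiting for a query the filter doesn't catch.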
Duplicate pages within a site should simply be ignored. Duplicate pages across sites need human intervention before one is removed; an arbitrary algorithmic choice risks dropping the original and leaving the stolen copy in the index.
Duplicate domains, however, can be detected by algo and removed - the SERPs would improve if Google did this.
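A minimal sketch of how that detection could work (my own illustration, not Google's method; a real engine would use shingling or similar fingerprints at scale): hash the content of each page and flag two domains whose fingerprints almost entirely coincide.

```python
import hashlib

def fingerprints(page_bodies):
    """Hash each page body so identical content maps to the same value."""
    return {hashlib.md5(body.encode("utf-8")).hexdigest() for body in page_bodies}

def duplicate_domains(pages_a, pages_b, threshold=0.9):
    """Flag two domains as duplicates if nearly all their content coincides."""
    fp_a, fp_b = fingerprints(pages_a), fingerprints(pages_b)
    if not fp_a or not fp_b:
        return False
    overlap = len(fp_a & fp_b) / min(len(fp_a), len(fp_b))
    return overlap >= threshold
```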
One problem with dynamic spam filtering, OOP and Bayesian filtering is that they necessarily introduce more discontinuities into otherwise clean (hopefully) algos. By discontinuities I mean lots of if..then..else instead of a*b + c*(d + e).
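A toy illustration of the point (mine, not Google's actual algorithm): a smooth formula versus one riddled with hard thresholds.

```python
def smooth_score(a, b, c, d, e):
    # Continuous in every signal: nudging any input nudges the score.
    return a * b + c * (d + e)

def discontinuous_score(a, b, c, d, e):
    # Hard cut-offs: a tiny change in one signal can flip a page from
    # ranking normally to being filtered out, which makes results jumpy
    # and the system much harder to reason about or tune.
    if a > 0.7:                      # e.g. an "over-optimisation" trigger
        return 0.0
    elif d + e > 1.5:                # another ad-hoc threshold
        return 0.1 * b
    else:
        return a * b + c * (d + e)
```

The smooth version responds proportionately to its inputs; the second version jumps, and every extra threshold is another parameter someone has to set by guesswork.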
It would appear that Google have gone down this path. This tells me that no one at the plex has a degree in Control Theory.
A few years ago, a minimum wage was introduced in the UK amid much fear that it would cause unemployment. However, it was set at a level where the impact was small. Similarly, if Google set their discontinuities at silly levels they may do no harm, but that is the very best that can be said of them.
If this is the future of Google, then Google has a bleak future - a pity, since I do believe their intentions are good. But it does go to show that dummies in high places are very dangerous.