More thoughts on this, and I've come up with yet another theory. Google is taking the long-term view of the web. They want to get rid of crap. So what do they do? They look at the people who are ranking. That is why we notice the pain: we were getting traffic from somewhere. We were doing well.
Now, they evaluate certain niches where scrapers, crap sites, and auto blogs abound. They evaluate the top players. If you are somehow not unique enough, you get demoted, letting your competitors, many of which may be crap, move up the ladder.
What do you do? You CLEAN UP YOUR SITE. You make sure you are unique. You do whatever it takes to improve. Okay, so now you do that. The next time the algo runs, the penalty is lifted, you are back up, and your link profile/authority is restored. The scrapers have been identified, and everyone scrambles. The people above you who kept pushing less unique material will in turn be demoted. They scream bloody murder. They clean themselves up or go out of business. Rinse and repeat.
That means if you hold top spots on some important search terms, beware: if you trigger this algo, you could get hit with the dreaded new penalty.
And I don't think this has to do with writing style or anything like that. It has to do with duplication, plagiarism, and scraping. They are hitting the niches where the heavy scraping tends to happen -- that, I think, is where the "12% of searches" figure comes from.
So perhaps they are doing this on purpose? Making scapegoats of some sites to get people changing things. They are putting the onus on us to clean up our sites. Once we are unique enough, they promote us back up to where we should be. The crap sites that were once on top won't stay there for long, if we all adhere to these standards.
It just sucks we have to do so much work to regain what we once had.