I'm throwing my vote in for a dup content filter problem.
My take is that Google's algorithm for attributing content to its original author is a little off at the moment.
Here are the tests that I've been doing and I'd be interested to see if others are seeing the same.
1) Using the old "&filter=0" at the end of my Google query URL puts my site back at number one, compared to the current >1000 for the target terms. (There's a rough sketch of how to run this comparison at the end of this post.)
2) I've also started searching Google for an exact match on some unique text (my company name). Seven sites show up, and mine is not one of them. (It's important to use text that appears in the Google SERP description.)
3) If I add "filter=0" to this search I get 18,000+ results (very flattering), but that's a lot of sites being filtered out over my content, and unfortunately mine is one of them.
4) Out of the seven sites above, one is a scraper site that puts nofollow tags on its links to my site.
5) Looking at the current top 20 sites that have 'survived' this latest change, I noted that none of them have cached links from this nofollow scraper, or from any other scraping/nofollow sites.
So again, I think this is a dup content filter problem, but I also suspect that nofollow is playing a part here.
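For anyone who wants to repeat tests 1 and 2, here's a minimal Python sketch of the comparison. The domain and query phrase are placeholders, so swap in your own site and target terms. Bear in mind this just checks whether your domain appears anywhere in the raw result HTML, not its exact rank, and that fetching Google's result pages programmatically is brittle (and against their terms of service), so treat it as an illustration of the test rather than a tool.

# Compare a normal Google SERP against the same query with &filter=0,
# which (historically) tells Google to skip the duplicate-content filter.
# DOMAIN and QUERY are placeholders - substitute your own values.
import requests
from urllib.parse import urlencode

DOMAIN = "example.com"           # placeholder: your site's domain
QUERY = '"some unique phrase"'   # placeholder: exact-match text from your pages

def serp_contains_domain(query, filtered=True):
    params = {"q": query, "num": 100}
    if not filtered:
        params["filter"] = "0"   # disable the duplicate-content filter
    url = "https://www.google.com/search?" + urlencode(params)
    resp = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=10)
    # Naive check: does the domain show up anywhere in the result HTML?
    return DOMAIN in resp.text

print("filtered SERP shows site: ", serp_contains_domain(QUERY, filtered=True))
print("filter=0 SERP shows site: ", serp_contains_domain(QUERY, filtered=False))

If your site only appears in the filter=0 run, that matches what I'm seeing: the content is indexed but getting suppressed by the duplicate filter.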