freejung - 7:47 am on Jul 29, 2010 (gmt 0)
Lapizuli, I love your blind robot analogy.
Carfac, I wasn't suggesting banning things with robots.txt. Others with more experience with very large sites would be more qualified to comment, but I wouldn't think you'd want to do that. After all, it's probably many of those low-traffic pages that were bringing in the traffic before they dropped below the threshold (or rather, before the threshold rose above them).
However, I think Robert has nailed it (not surprisingly) with a really brilliant insight. It's not that you have too much _content_, it's that you have too many _pages_ with not enough unique content _per page_. If you can consolidate the same amount of information into fewer pages (and remember to 301-redirect the old URLs so they pass along their link juice), and also find ways to add more unique information per page, that might help.
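For anyone doing the consolidation, here's a minimal sketch of the 301 part, assuming an Apache server with mod_rewrite enabled (the URLs are made up for illustration):

```apache
# Hypothetical example: permanently redirect two thin pages
# to one consolidated page so inbound links keep their value.
RewriteEngine On
RewriteRule ^widgets/blue-widget\.html$  /widgets/ [R=301,L]
RewriteRule ^widgets/red-widget\.html$   /widgets/ [R=301,L]
```

The R=301 flag is what tells crawlers the move is permanent; a plain Redirect or a rewrite without it defaults to a 302, which doesn't consolidate link juice the same way.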
It sounds like your pages are exactly the sort of thing Google was deliberately targeting with Mayday: pages that were ranking primarily because they were part of a big authoritative site, not because that particular page contained a lot of unique information or had a lot of external inbound links.