I've been working hard on fixing up the errors in WMT.
I've been using a combination of robots.txt, 301 redirects, and noindex,follow meta tags.
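For anyone following along, the noindex,follow part is just the standard robots meta tag; a minimal sketch:

```html
<!-- Placed in the <head> of each page that should be dropped from the
     index while still letting Googlebot follow its outgoing links -->
<meta name="robots" content="noindex,follow">
```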
The errors are shrinking daily, but of course, the "access denied" messages are increasing. There are millions of pages either blocked by robots.txt or noindexed.
I have removed the links to most of the error sources, as well as the links to the profile pages, which have been noindexed.
However, certain links still exist. For example, we have a report button on each download page for users to report content. A unique ID is assigned when the link is clicked, so the next time Google visits that URL, access is denied. I'm not sure how to get around this atm. (Hide it from Google? :op)
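One possible way around it (just a sketch of the general idea; the /report.php path and download_id field are made-up placeholders, not our actual setup) is to fire the report via a POST form instead of a crawlable link, since Googlebot generally doesn't submit forms:

```html
<!-- Report button as a POST form rather than an <a href> link carrying
     a unique ID in the URL, so there is no fresh report URL for Googlebot
     to crawl. The action path and field name are hypothetical placeholders. -->
<form action="/report.php" method="post">
  <input type="hidden" name="download_id" value="12345">
  <button type="submit">Report this content</button>
</form>
```

Blocking the report path in robots.txt as well would be a belt-and-braces option.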
All of the pages I've blocked are similar to this: junk index.php? query-string links.
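The blocking rule is along these lines (a sketch; robots.txt rules are prefix matches, so this catches any URL beginning with /index.php? while leaving /index.php itself crawlable):

```text
# robots.txt - block the junk index.php query-string URLs
User-agent: *
Disallow: /index.php?
```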
Anyway, the big question is: could blocking such a huge volume of pages be an issue? Making changes like this always makes me anxious.
Logically, these changes seem in keeping with Panda, but I'd still like a second opinion.
Thanks :)