If you are trying to conserve a "crawl budget", another option is to play around with the Parameter Handling settings in Google Webmaster Tools (Site configuration > Settings > Parameter handling).
I just had a similar problem on a very large dynamic site: thousands of errors were showing in GWT, and Google ignored the robots.txt exclusions. I set up meta robots exclusions for some directories and X-Robots-Tag (exclusion via the HTTP response header) for others.
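For reference, here is roughly what the two exclusion methods look like in practice. This is just a minimal sketch assuming a Flask app; the /search/ and /archive/ paths are hypothetical stand-ins for whichever directories you want kept out of the index.

```python
from flask import Flask, request

app = Flask(__name__)

# Hypothetical directory prefixes to exclude via the X-Robots-Tag header.
NOINDEX_PREFIXES = ("/search/", "/archive/")

@app.after_request
def add_x_robots_tag(response):
    # Same exclusion the meta tag gives you, but at the HTTP header level,
    # which also covers non-HTML responses (PDFs, images, etc.).
    if request.path.startswith(NOINDEX_PREFIXES):
        response.headers["X-Robots-Tag"] = "noindex, nofollow"
    return response

@app.route("/search/")
def search():
    # For HTML pages you can instead (or additionally) emit the meta tag
    # in the <head> of the rendered page.
    return "<html><head><meta name='robots' content='noindex, nofollow'></head><body>...</body></html>"
```

The header approach is handy when the pages are generated by code you don't control the templates for, or when the responses aren't HTML at all.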
All of this helped, but GWT still kept showing errors (duplicate titles, 404s, 500s, etc.).
The last thing I changed was to block those directories via the Parameter Handling settings. It took a while, but GWT is finally showing very few errors.
None of this is really helping the site, which took a huge hit from Mayday, but that's another story.