Do you really want Googlebot crawling all of those documents and displaying URI-only listings?
I don't see Googlebot crawling robots.txt-disallowed documents, pageone. At least not routinely, though there is the occasional edge case. They can and do add disallowed URLs to their index, but that's just the URL, not the content. I don't like that either, but it isn't a violation of the robots.txt protocol.
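To illustrate the distinction, a minimal robots.txt rule like this (the /private/ path is just a placeholder) blocks crawling of the content, but not indexing of the URL itself:

    # Block all compliant crawlers from fetching anything under /private/
    User-agent: *
    Disallow: /private/

If another site links to a page under /private/, Google can still list that URL in results as a bare, snippet-free entry, because it was never allowed to fetch the content behind it.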
They do crawl documents with a noindex robots meta-tag, however. In fact, that's the only way they can even SEE that type of directive.
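That's because the noindex directive lives in the page's own markup, so the crawler has to be allowed to fetch the page before it can obey it. A minimal sketch of such a page head (hypothetical markup):

    <head>
      <!-- Tells compliant crawlers: you may fetch this page, but do not index it -->
      <meta name="robots" content="noindex">
    </head>

Note the trap: if that same page is also disallowed in robots.txt, Googlebot never fetches it, never sees the noindex tag, and the URL-only listing can persist. To get a page deindexed via meta-tag, it has to stay crawlable.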