Is this what you have in your actual robots.txt or the 'English Version' of what you have?
How can I remove my pages from the cache and my site from Google?
Personally, in this situation I would be inclined to let the pages be spidered (remove the disallow from the robots.txt) and replace the content with a custom 404 (or 410) page with <meta name="robots" content="noindex,nofollow,noarchive"> in the <head>.
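For reference, a minimal custom error page along those lines might look like the sketch below. The filename, title, and body text are all placeholders to adjust for your own site; the important part is the robots meta tag in the <head>:

```html
<!-- Sketch of a custom 404/410 page; wording and title are placeholders -->
<!DOCTYPE html>
<html>
<head>
  <title>Page Removed</title>
  <!-- tells compliant spiders not to index, follow links from, or cache this URL -->
  <meta name="robots" content="noindex,nofollow,noarchive">
</head>
<body>
  <p>Sorry, this page has been removed.</p>
</body>
</html>
```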
Leaving it alone or serving a 404 will both work with the removal tool, if you decide to go that route. With or without the removal tool, I would personally use a 404 (or more likely a 410) and let the URLs be spidered, so there's actually a more recent version of the content (none) at each location than what they have cached. If you use the removal tool the pages should be gone for 6 months for sure, so it may simply come down to personal preference, but IMO if you keep the URLs disallowed they may revert to the old cache when the 6 months is up, so personally I would want to tell them each page is Intentionally Removed (410 Gone).
It takes a bit of knowledge of mod_rewrite or a scripting language to serve a Gone error, but it's my preference for pages I have removed the content from, especially over a 404. I guess you could do it by serving a custom 404 page in PHP and then setting the header to 410 Gone on the actual error page, but I haven't tried it, so double-check with a Server Header Check tool (like the one in the control panel here) if you try to serve a 410 this way...
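To sketch both approaches (the `/old-section/` path and filenames here are made up, so adjust them for your own site, and verify the result with a header check since I haven't tested the PHP variant): with mod_rewrite, Apache can answer 410 directly, and in PHP you can override the status on the error page itself.

```apache
# .htaccess sketch -- "old-section" is a hypothetical path prefix
RewriteEngine On
# The [G] flag makes Apache respond 410 Gone for matching URLs
RewriteRule ^old-section/ - [G]
```

```php
<?php
// Sketch of the PHP idea above: at the very top of the custom 404 page,
// override the status so the response goes out as 410 Gone instead of 404.
header("HTTP/1.1 410 Gone");
?>
```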
Why is it so complicated? Google's way of saying thanks for letting us spider your site and have access to your content in the first place... If you want us to let go of it you'll have to work for it. LOL.
If you remove the robots.txt block and set up a custom 404 page with the robots meta tag I posted above, they won't return the cache or the pages in the results any more, but you will have to wait for them to spider the pages for it to take effect. It's the simplest, most straightforward way IMO.
Also, for future readers: although you should be able to serve a 410 Gone as outlined above, be aware that if a page ever goes missing for real (a genuine 404), you will be serving a 410 for it instead, so it's not something to use unless you know for sure you want all missing (404) pages dropped and not spidered again for a longer period of time.
All you really need is to get the SEs <meta name="robots" content="noindex,nofollow,noarchive"> and all references to the page it's on will be dropped from the results. Personally, I almost always serve noarchive on my pages, even when I allow the pages to be indexed and returned in the results.
If those pages are already gone from your server, then the minute you allow googlebot in to spider, it should get a 404 response without you needing to place a custom message or do any redirect at all.
Then you can use WebmasterTools to remove the pages - or the entire site. Either one.
Yeah, what tedster said. Being able to set a custom 404 error page is usually standard in most hosting accounts, so by creating a single page you can serve a cool, site-specific 404 page for the visitors (real people) who request a non-existent URL. IMO it's a good way to do things, and I use them on almost all, if not all, sites I work on.
I usually include links to 'important pages' or directories visitors might be looking for. It's a single page, and it can usually be set from within your hosting account.
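On Apache hosts, pointing the server at that page is typically a one-line directive in .htaccess (the path here is just an example; some hosting control panels set this for you):

```apache
# Serve a custom page (here /404.html, an example path) for missing URLs
ErrorDocument 404 /404.html
```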
Do make sure you run a header check when using one, to ensure it actually serves a 404 status... IMO the issue may have been prolonged by disallowing the content rather than serving a 404 page even without the robots meta tag, but it definitely has been prolonged by not serving a 404 page with the meta tag I posted previously, because as soon as compliant bots get the noindex,nofollow,noarchive tag on a page (URL) and that URL is processed, the page is dropped from the results.