If it's going to take a long time for the pages to be respidered, I would exclude them in robots.txt and remove them using the URL removal tool in Webmaster Tools.
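As a sketch, the robots.txt exclusion would look something like this (the /old-pages/ folder is just a hypothetical placeholder for wherever your pages actually live):

```
User-agent: *
Disallow: /old-pages/
```

A Disallow on a folder path blocks everything beneath it; use a full path like `Disallow: /old-pages/specific-page.html` to block a single URL instead.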
Finally, a blank page is not a good idea IMO - it may be interpreted as a temporary error and cause Google to keep the cached content for even longer.
Using nofollow is practically worthless as a means of preventing Google from indexing the page. All it takes is for one other site to link to the page with a followed link and BAM... it's back in the index.
Using a robots.txt disallow won't prevent Google from showing your page in the SERPs either if enough other sites link to it. While Google won't be able to crawl the page, they can still show a link to the page. Typically, it will appear with only a <title> which they infer from the various link texts pointing to the page and the URL. There won't be a snippet.
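You can sanity-check your disallow rules before deploying them. Here's a minimal sketch using Python's standard-library robots.txt parser, with made-up rules and URLs matching the hypothetical /old-pages/ folder above:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules, parsed from a list of lines
rules = [
    "User-agent: *",
    "Disallow: /old-pages/",
]

rp = RobotFileParser()
rp.parse(rules)

# A compliant crawler may not fetch the disallowed URL...
print(rp.can_fetch("Googlebot", "https://example.com/old-pages/page.html"))  # False
# ...but URLs outside the disallowed folder remain crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/kept-page.html"))  # True
```

Note this only tells you whether crawling is blocked; as described above, a blocked URL can still appear in the SERPs as a bare link.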
Once you have the <meta name="robots" content="noindex"> in place for each of the URLs, you can sign into Google's Webmaster Tools and request a URL removal. It generally takes a couple of business days; if it's an absolute emergency, you can request an emergency URL removal. You can request individual pages, folders, or the complete site (not applicable in your case).
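For reference, the tag goes in the <head> of each page you want dropped from the index, e.g.:

```
<head>
  <!-- Tells compliant crawlers not to index this page -->
  <meta name="robots" content="noindex">
</head>
```

One caveat worth keeping in mind: Google has to be able to crawl the page to see the noindex tag, so if the same URL is also blocked in robots.txt, the tag may never be read. Remove the robots.txt block (or rely on the removal tool) if you go the noindex route.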
Just Google "google url removal request", "google immediate url removal request", "google emergency url removal request", etc.