You need a <meta name="robots" content="noindex"> in the <head> of each page if you want to prevent Google from showing it in the SERPs.
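For reference, the tag goes anywhere inside the `<head>`; a minimal sketch (the title and page are just placeholders):

```html
<head>
  <title>Page you want out of the index</title>
  <!-- Tells compliant crawlers not to include this page in search results -->
  <meta name="robots" content="noindex">
</head>
```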
Using nofollow is practically worthless as a means of preventing Google from indexing the page. All it takes is one other site linking to the page with a followed link and BAM... it's back in the index.
Using a robots.txt disallow won't prevent Google from showing your page in the SERPs either if enough other sites link to it. While Google won't be able to crawl the page, they can still show a link to it. Typically, it will appear with only a <title>, which Google infers from the various link texts pointing to the page and from the URL itself. There won't be a snippet.
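For contrast, this is the robots.txt pattern in question (the path is hypothetical); it blocks crawling, not indexing:

```
User-agent: *
# Blocks crawlers from fetching this URL, but an already-known
# URL can still appear in results as a bare link
Disallow: /private-page.html
```

Note the interaction between the two mechanisms: a robots.txt disallow also hides your noindex tag from Google, since a page that can't be crawled can't have its meta tags read. The page must remain crawlable for the noindex to take effect.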
Once you have the <meta name="robots" content="noindex"> in place on each of the URLs, you can sign into Google's Webmaster Tools and request a URL removal. It generally takes a couple of business days. If it's an absolute emergency, you can request an emergency URL removal. You can request removal of individual pages, folders, or the complete site (the last of which isn't applicable here).
Just Google "google url removal request", "google immediate url removal request", "google emergency url removal request", etc.