TheMadScientist - 9:12 pm on Jul 23, 2010 (gmt 0)
We recently told G to nofollow these links because we spotted some in the index even though they were blocked in robots.txt
Neither of the methods you used will keep the pages out of the index. Google already has the URLs, and a robots.txt block only keeps them from crawling the pages. So they take any links without 'nofollow' pointing to a page, plus whatever other information they can relate to it, and try to figure out what the page is about. Then they often include the page in the index anyway, because they can't judge whether the page is important by visiting it, and they don't want to leave out a page their visitors expect to find...
There are ways to get the pages out and keep them out. One of the simplest: use the removal tool for the URLs now listed, then change the links to point to a single URL that redirects to the right page based on the link clicked, and have that URL blocked in robots.txt with the links nofollowed like you do now... If anything shows in the index it will only be the URL of the redirect page. Since it's dynamic, you can put a simple 'nothing to see here' page up when no link was clicked, so a direct visit yields nothing, or you could even put a small 'you might be interested in' sitemap on it for direct visits.
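A rough sketch of what that single redirect URL could look like, using only Python's standard library. The `/go` path, the `?to=` parameter, and the destination names are all hypothetical, just to illustrate the idea; you'd block the path with `Disallow: /go` in robots.txt and point the nofollowed links at it.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

# Hypothetical whitelist mapping link keys to real pages. Redirecting only
# to known keys keeps this from becoming an open redirect.
DESTINATIONS = {
    "pricing": "/pricing.html",
    "signup": "/signup.html",
}

def resolve(path):
    """Given a request path, return (status, redirect_target_or_None)."""
    query = urlparse(path).query
    key = parse_qs(query).get("to", [""])[0]
    target = DESTINATIONS.get(key)
    if target:
        # A link was clicked with a known key: redirect to the real page.
        return 302, target
    # Direct visit, or unknown key: serve the 'nothing to see here' page.
    return 200, None

class GoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if urlparse(self.path).path != "/go":
            self.send_error(404)
            return
        status, target = resolve(self.path)
        self.send_response(status)
        if target:
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<p>Nothing to see here.</p>")
```

With this, `/go?to=pricing` 302-redirects to the pricing page, while a bare `/go` visit gets the placeholder; you could swap the placeholder body for a small sitemap if you prefer.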