---- Pages are indexed even after blocking in robots.txt
not2easy - 8:13 pm on Aug 31, 2012 (gmt 0)
Use a noindex robots meta tag instead of robots.txt rules. Blocking a URL in robots.txt only stops crawling, not indexing: Google can still index the URL from links without ever fetching the page, which is why it never sees a noindex tag on a blocked page.
And keep pages you do not want indexed out of your sitemap. Google will index pages it finds through internal and external links, as tedster says, so it helps to "nofollow" links to pages you do not want crawled, but the meta tag on the page itself will prevent indexing. If the URL is in your sitemap, the page will be crawled.
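For example, a minimal sketch of the noindex meta tag, plus the equivalent X-Robots-Tag HTTP header for non-HTML files like PDFs (the header example assumes an Apache .htaccess setup):

```html
<!-- In the <head> of each page you want kept out of the index.
     Google must be able to crawl the page to see this tag, so do
     NOT also block the same URL in robots.txt. -->
<meta name="robots" content="noindex">

<!-- For non-HTML resources (e.g. PDFs), send the same directive as
     an HTTP header instead. Example for Apache .htaccess: -->
<!--
<Files "private.pdf">
    Header set X-Robots-Tag "noindex"
</Files>
-->
```

Either way, the directive only works if Googlebot is allowed to fetch the URL and read it.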