lucy24 - 4:18 pm on Aug 5, 2013 (gmt 0)
The Submitted/Indexed ratio is still the same, which to me says google is not obeying the robots.txt directive
The rest of your post seems to say the opposite: that g### IS obeying robots.txt and therefore not crawling the blocked pages.
Crawling and indexing are different things.
This is an interesting post because others have suggested that the sitemap overrides robots.txt, such that anything in a sitemap will be crawled even if roboted-out. Your experience seems to say otherwise.
Anyway, the solution is straightforward. If you want a page to be neither crawled nor indexed, don't include it in a sitemap. If you want it to be crawled but not indexed, give each page a robots meta tag with noindex (or send the equivalent X-Robots-Tag HTTP response header). If google knows that a resource exists, it will index it unless it has been explicitly told not to.
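In case it helps to see the two mechanisms side by side (the /private/ path is just a made-up example):

  # robots.txt -- blocks crawling, but the URL can still be indexed if google learns of it some other way
  User-agent: *
  Disallow: /private/

  <!-- robots meta tag in the page's <head> -- allows crawling, blocks indexing -->
  <meta name="robots" content="noindex">

  # equivalent HTTP response header
  X-Robots-Tag: noindex

The catch with noindex is that googlebot has to be able to fetch the page to see it, so don't also block that page in robots.txt or the noindex will never be read.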