Str82u - 5:31 pm on Aug 9, 2012 (gmt 0)
I can say that as a test last month I created a page with a deliberately broken link, disallowed that page in robots.txt, and only THEN uploaded it. Two weeks later both Bing and Google were showing me the 404 for the link target, a URL they should never have seen if they obeyed the robots directive.
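If anyone wants to reproduce that setup, here's a minimal sketch of the kind of disallow rule the test relied on, checked with Python's standard urllib.robotparser; the page name and domain are just placeholders, not my actual test page:

```python
from urllib import robotparser

# Hypothetical rule set equivalent to what the test used; the path
# /bad-link-page.html is a stand-in for the real test page.
rules = [
    "User-agent: *",
    "Disallow: /bad-link-page.html",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A compliant crawler should get False here, meaning it is not allowed
# to fetch the blocked page and so should never see the bad link on it.
print(rp.can_fetch("Googlebot", "http://example.com/bad-link-page.html"))  # False
print(rp.can_fetch("bingbot", "http://example.com/bad-link-page.html"))    # False
```

Both bots answer False to that check, which is exactly why the 404 report for the link on that page surprised me.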
It seems like traffic is better when duplicate titles and descriptions are kept under control, but I don't think they have a huge effect on the SERPs unless those pages are actually included in the index. If you do a site:domainname search, do you see those URLs? If not, it's probably just an issue WMT is bringing to your attention rather than something that needs action.