This sounds like something better handled via .htaccess. robots.txt won't help against search engines that have already found the URIs: they will keep revisiting their already-collected links to see if those links still exist. Instead, feed them a 410 (Gone), a 404 (Not Found, i.e. do nothing), or a 301 redirect to the page you want the SE to find.
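For example, a minimal .htaccess sketch, assuming Apache with mod_alias and mod_rewrite available (the paths and patterns here are hypothetical placeholders — substitute your own bad URIs):

```apache
# 410 Gone for a whole retired path (mod_alias)
Redirect gone /old-section/

# 301 an individual old URI to its replacement (mod_alias)
Redirect 301 /old-page.html /new-page.html

# Pattern-based 410 for a family of obsolete URIs (mod_rewrite)
RewriteEngine On
RewriteRule ^obsolete/ - [G,L]
```

A 410 tells crawlers the resource is intentionally gone, so most will drop it from their index faster than with a 404; use the 301 only when there is a genuine replacement page worth passing the traffic to.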
All of this assumes that the bad URIs really are NOT THERE anymore. If they still exist on your site, then WHY?