Note also that regular expressions are not supported in either the User-agent or Disallow lines. The '*' in the User-agent field is a special value meaning "any robot". Specifically, you cannot have lines like "Disallow: /tmp/*" or "Disallow: *.gif".
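To make the quoted rule concrete, here's a minimal robots.txt that sticks to the original spec — literal path prefixes only, with '*' allowed solely as the User-agent value:

```
# "*" here is the special "any robot" value, not a wildcard pattern
User-agent: *

# Prefix match: blocks /tmp/ and everything under it
Disallow: /tmp/

# These would NOT work under the original spec (no wildcards in paths):
# Disallow: /tmp/*
# Disallow: *.gif
```

Disallow is a simple "URL starts with this string" test, so `Disallow: /tmp/` already covers everything the wildcard version was trying to express.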
In other words, they're so set against supporting wildcarding that they mistakenly call it a regular expression, which they also don't support.
Note also that grammar are not important to robots...
Google and Yahoo! support wildcards; I don't think Ask does as yet, nor do most of those other pesky little critters that flit around all over the net. So anything blocked only with wildcards is going to wind up in the wild eventually anyway. I think the best bet is to use the robots meta tag. Just be sure not to block the subfolders in robots.txt, so bots will be able to read the pages and obey the instruction.
Added: And of course, if there are links to the pages in the subfolders, the meta robots tag will ensure that those pages don't wind up as URL-only listings in the SERPs.
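A sketch of the setup described above (the /private/ folder name is made up for illustration): leave the subfolder unblocked in robots.txt so crawlers can fetch the pages, and put a robots meta tag in the head of each page you want kept out of the index:

```html
<!-- In the <head> of each page under the hypothetical /private/ subfolder -->
<!-- noindex: keep the page out of results; follow: still crawl its links -->
<meta name="robots" content="noindex, follow">
```

Because the crawler can actually fetch the page, it sees the noindex and drops it from the index entirely, instead of listing it URL-only the way a robots.txt-blocked page can be.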