I am just passing along second-hand knowledge, so take it for what it is worth. Here goes...
If I thought that Google SUSPECTED I had some shady content, the last thing I would do is block it via robots.txt. I would want Googlebot to see as much of my site as possible so they would know I wasn't up to no good.
I would, if at all possible, serve a 410 instead of a 404 status for those forum pages. I don't know how to do that at the server level (I only know how to do it at the page level with PHP - not fun if you have lots of pages to get rid of).
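That said, if you're on Apache, I believe you can do it in .htaccess without touching each page - something like this, though I haven't tested it myself, and the paths here are just made-up examples:

```apache
# Return 410 Gone for a specific dead forum page
Redirect gone /forum/old-thread-123.html

# Or for a whole pattern of URLs, using mod_rewrite's [G] flag
RewriteEngine On
RewriteRule ^forum/spam-.*$ - [G]
```

Adjust the paths/pattern to match your forum's actual URL structure.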
I think if you block with robots.txt, then Googlebot won't crawl those pages at all, so it will never see the 404/410 status. So I don't know that robots.txt would do you any good.
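In other words, if you had something like this in robots.txt (the /forum/ path is just an example), Googlebot would never request those pages and never see the 410:

```
User-agent: Googlebot
Disallow: /forum/
```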
I think there is now a way to (temporarily) remove lots of URLs from the Google index in Webmaster Tools, so that might be an option as well.
But I think the important thing is letting Google see that those pages are truly gone.
Wait for others to pitch in here first, though.