This is a rather basic question, but could provide a lot of value to those running their own forums. Search engines will try to index any and all pages they can find. On some forums, there are toolbar links within a thread that look like:
I think it's a good idea to prevent "junk" pages from being indexed - forms, odd versions, etc. Normally these won't be well-ranked, but I'd prefer the SEs to focus on the good content instead of scarfing down low-content pages.
You can restrict indexing with your robots.txt file and/or a robots NOINDEX meta tag. The advantage of the robots.txt approach is that well-behaved bots will never even request the page. The downside is that Google, when it finds the link on a "good" page, may stick that URL in its index for future reference. It won't spider it, but it may keep the entry for it. If you put in the NOINDEX tag, that should prevent it from being listed in any way. Of course, the spider has to request the page to see that tag, so you'll still see some of the bandwidth usage.
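For illustration, here's roughly what each approach looks like. The paths and page names below are made-up examples, not anything specific to your forum software:

```
# robots.txt — well-behaved bots won't even request these URLs
# (paths are hypothetical examples)
User-agent: *
Disallow: /newthread.php
Disallow: /sendmessage.php
```

```html
<!-- In the <head> of a page you don't want listed. The bot still fetches
     the page (so you pay the bandwidth), but it drops the page from the index. -->
<meta name="robots" content="noindex">
```

Note you shouldn't combine both for the same URL: if robots.txt blocks the page, the crawler never fetches it and so never sees the NOINDEX tag.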