Whether it is worthwhile to prevent the "good" bots from spidering areas of your site is your own call. If it is, robots.txt is an easy way to do it; if not, don't bother.
There are several reasons to have a robots.txt:

1) Prevent 404 Not Found entries in your logs from robots looking for robots.txt.
2) Control bandwidth costs by limiting spidering.
3) Control how your pages are presented in the various SERPs by controlling spidering.
4) Identify good vs. bad bots (a sketch follows this list):
   - Disallow a 'page' in robots.txt.
   - Rewrite that 'page' URI to a script.
   - Use that script to ban any bad bot that fetches the Disallowed 'page' URL.
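Item 4 is the classic robots.txt honeypot: any client that requests a URL you have explicitly Disallowed has, by definition, ignored robots.txt. A minimal sketch of the idea using Python's standard-library http.server follows; the trap path /trap/, the port, and the in-memory ban list are assumptions made for illustration only, and in a real deployment the rewrite and the ban would normally live in your web server configuration rather than in application code:

    # Sketch of a bad-bot trap. /trap/ is Disallowed in robots.txt, so any
    # client that fetches it has ignored robots.txt and gets banned.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    BANNED = set()          # IPs that have hit the trap (assumed in-memory store)
    TRAP_PATH = "/trap/"    # must match the Disallow line served below

    ROBOTS_TXT = "User-agent: *\nDisallow: /trap/\n"

    class TrapHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            ip = self.client_address[0]

            if ip in BANNED:
                # Already flagged as a bad bot: refuse everything.
                self.send_error(403, "Forbidden")
                return

            if self.path == "/robots.txt":
                # Serve the robots.txt that Disallows the trap path.
                body = ROBOTS_TXT.encode()
                self.send_response(200)
                self.send_header("Content-Type", "text/plain")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
                return

            if self.path.startswith(TRAP_PATH):
                # Fetching a Disallowed URL marks this client as a bad bot.
                BANNED.add(ip)
                self.send_error(403, "Forbidden")
                return

            # Normal site content would be served here.
            body = b"Hello\n"
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("", 8000), TrapHandler).serve_forever()

Good bots read robots.txt and never touch /trap/, so they are never banned; a bad bot that spiders every link (or deliberately crawls Disallowed paths) trips the trap on its first request there.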