Forum Moderators: goodroi
Welcome to WebmasterWorld! Blocking bots is a personal decision since every site is different. Some webmasters value the search engines and want to allow googlebot, Yahoo's Slurp and msnbot to crawl their site. Others see search engine crawlers as a waste of time and bandwidth, so they may want to limit or block them. Another example is the ia_archiver bot, which takes snapshots of your site over time. Some people like having that history; others view it as a legal liability.
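For example, if you wanted to let the search engine crawlers through but keep ia_archiver from archiving your pages, a robots.txt along these lines would do it (this is just a sketch; double-check each bot's published user-agent token before relying on it):

  User-agent: ia_archiver
  Disallow: /

  User-agent: *
  Disallow:

The first record tells ia_archiver to stay out of everything; the empty Disallow in the catch-all record leaves the site open to every other well-behaved bot.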
I would suggest looking at your log files to see which bots are hitting you, then look them up and decide whether you want them crawling your site. You may find this database of the bigger web robots [robotstxt.org] helpful.
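If you want a quick way to see which user-agents are hitting you, something like this Python sketch will tally them from a combined-format access log (it assumes the user-agent is the last double-quoted field on each line, and "access.log" is just a placeholder path; adjust both to your own server setup):

  # Tally user-agent strings from a combined-format access log.
  from collections import Counter

  counts = Counter()
  with open("access.log") as log:
      for line in log:
          # The agent is the final quoted field: ... "referer" "agent"
          parts = line.rsplit('"', 2)
          if len(parts) == 3:
              counts[parts[-2]] += 1

  # Print the 20 most frequent user-agents with their hit counts.
  for agent, hits in counts.most_common(20):
      print(f"{hits:6d}  {agent}")

Once you have that list, you can look up the unfamiliar names and decide which ones deserve a robots.txt record.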