incrediBILL - 7:20 pm on Jun 22, 2013 (gmt 0)
Bot-blocking blacklists are useless because some rogue spiders just generate random user-agent strings, so they will never be in your list to begin with.
The simpler route is a whitelist: allow the good spiders like Googlebot, Bingbot, etc., allow normal browser user agents, and deny everything else. That list is shorter, faster and more efficient, needs less maintenance, and is much quicker for the server to process on every page than a large linear blacklist full of useless junk.
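A minimal sketch of the whitelist idea, assuming you can hook a check like this into your request handling; the bot and browser patterns are illustrative examples, not a complete production list:

```python
import re

# Hypothetical whitelist patterns: known good crawlers plus the tokens
# that legitimate browser user agents carry. Anything that matches
# neither list is denied by default.
GOOD_BOTS = re.compile(r"Googlebot|Bingbot|Slurp", re.IGNORECASE)
BROWSERS = re.compile(r"Mozilla|Opera", re.IGNORECASE)

def allow(user_agent: str) -> bool:
    """Whitelist check: allow known spiders and browser UAs, deny the rest."""
    if not user_agent:
        return False
    return bool(GOOD_BOTS.search(user_agent) or BROWSERS.search(user_agent))
```

Note that user-agent strings can be spoofed, so a whitelist like this is usually paired with extra verification for the big crawlers (e.g. Google documents confirming Googlebot via reverse DNS lookup).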