incrediBILL - 10:29 pm on Jun 23, 2013 (gmt 0)
See, that's the problem with blacklisting: you're always chasing bot user agents, and by the time you've identified them they've already been on the machine and done their dirty deeds, so it's too late.
It's an endless game of cat and mouse, a complete no-win scenario and a continual waste of time.
I switched to whitelisting in 2005 and rarely ever add new bots to the list. On the other hand, I never worry about my site being attacked by any user agent I don't know about, because if I don't know about it, it's not getting access.
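A minimal sketch of the whitelist idea in Python, just to show the logic. The tokens below are illustrative assumptions, not anyone's actual list; a real setup would match the specific crawlers and browsers you actually want to let in.

```python
# Whitelist sketch: allow only user agents containing a known-good token.
# Tokens here are illustrative examples, not a recommended production list.
ALLOWED_TOKENS = ("Googlebot", "Bingbot", "Mozilla")

def is_allowed(user_agent: str) -> bool:
    """Return True only if the UA matches a token we explicitly trust."""
    return any(token in user_agent for token in ALLOWED_TOKENS)

print(is_allowed("Mozilla/5.0 (Windows NT 10.0)"))  # known browser style: allowed
print(is_allowed("XYZABC bot/1.0"))                 # unknown bot: denied by default
```

The key property is deny-by-default: a brand-new scraper UA fails the check without anyone having to hear of it first.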
I don't see why people continue to play the blacklist game, because you can only block a bot once you're aware of it. If I write an "XYZABC bot" right now and scrape your site, I can do it because you don't know I exist.
Not only that, the bigger the list gets, the longer it takes to process: every single line of the list must be checked against the user agent on every page view, which adds up to a lot of wasted CPU time over a million page views.
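A rough way to see that per-request cost difference, sketched in Python. The pattern counts are made up for illustration (a 5,000-entry blacklist vs. a 3-entry whitelist); the point is only that the blacklist scan touches every pattern on every request while the whitelist stays tiny.

```python
import re
import timeit

# Hypothetical blacklist of 5,000 bad-bot patterns, all scanned per request.
blacklist = [re.compile(rf"badbot{i}\b") for i in range(5000)]
# Whitelist stays small: a handful of known-good tokens.
whitelist = [re.compile(p) for p in (r"Googlebot", r"Bingbot", r"Mozilla")]

ua = "Mozilla/5.0 (compatible; SomeNewScraper/1.0)"

def blacklist_check() -> bool:
    # Allowed only if NO blacklist pattern matches: every pattern is tried.
    return not any(p.search(ua) for p in blacklist)

def whitelist_check() -> bool:
    # Allowed if ANY whitelist pattern matches: a few patterns at most.
    return any(p.search(ua) for p in whitelist)

print("blacklist:", timeit.timeit(blacklist_check, number=100), "s")
print("whitelist:", timeit.timeit(whitelist_check, number=100), "s")
```

Note the irony in this example: the unknown scraper sails through the blacklist (no pattern knows it yet) but would still be judged by the whitelist on its browser-looking UA, which is exactly why whitelisters pair the UA check with other validation.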