dstiles - 9:50 pm on Dec 7, 2011 (gmt 0)
I would go further with the benefits:
If you block scrapers you can avoid a lot of Google's duplicate-content problems (and possibly even pandalization, but that's just an opinion).
You can also maintain a more secure site by rejecting virus implanters. Even if your site/server is already secure, an extra string to your bow is always helpful.
Otherwise, yes. It is very time-intensive to build and, in my experience, time-intensive to maintain. Over the past couple of weeks I've seen a significant increase in "new" nasties (roughly three-fold), mostly, it seems, from compromised servers.
In theory, if everyone ran virus-proof servers and broadband-connected computers, botnets would not exist. I wonder whether it's possible to pursue some of the major server farms through legal channels? A lot of it is under their control, after all. Probably not, though. :(
Globetrotter - if you run a Linux-based web site (or IIS with an add-on that implements .htaccess rules), your workload is much lighter than it would be otherwise.
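Something along these lines in .htaccess is the general idea - a minimal sketch only, assuming Apache 2.2 with mod_setenvif enabled, and the user-agent strings here are illustrative examples rather than a real blocklist:

    # Flag requests whose User-Agent matches known scraper patterns
    SetEnvIfNoCase User-Agent "libwww-perl" bad_bot
    SetEnvIfNoCase User-Agent "HTTrack" bad_bot
    # An empty user-agent is almost always a script, not a browser
    SetEnvIfNoCase User-Agent "^$" bad_bot

    # Refuse anything flagged above (Apache 2.2 Order/Deny syntax)
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot

The rules themselves are the easy part; deciding what belongs on the list and keeping it current is where the maintenance time I mentioned goes.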