Well, I understand Brett's issues, but there are lots of other ways to stop rogue bots than just turning off access to all robots. I have similar issues to Brett's in that rogue bots can shut down my site for a few minutes (like the WW slowdown a week ago) when they hammer certain CPU-intensive dynamic pages and request them all in just a few seconds, those greedy little pigs.
A simple solution I found for slowing and stopping unauthorized bots was to limit the number of pages they can download within a certain amount of time, and block them automatically (via the dynamic pages) if there are too many page requests within a minute. For instance, a human actually reading pages sure as heck can't download 100 pages in a minute and isn't likely to read 100 pages in 5 minutes either, so when that behavior starts I just start serving up error pages unless it's an authorized bot.
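Roughly what that per-IP check looks like, as a minimal sketch in Python. The thresholds, function names, and in-memory counter here are illustrative assumptions, not my actual code; in a real setup the counts would live somewhere shared so every page and process sees the same numbers:

import time
from collections import defaultdict, deque

# Illustrative thresholds -- tune these for your own traffic.
MAX_REQUESTS_PER_MINUTE = 100
WINDOW_SECONDS = 60

_hits = defaultdict(deque)   # per-IP request timestamps
_blocked = set()             # IPs that have already tripped the limit

def should_block(ip, now=None):
    """Return True if this IP has exceeded the per-minute page limit."""
    now = time.time() if now is None else now
    if ip in _blocked:
        return True
    hits = _hits[ip]
    hits.append(now)
    # Drop timestamps that have fallen out of the sliding window.
    while hits and now - hits[0] > WINDOW_SECONDS:
        hits.popleft()
    if len(hits) > MAX_REQUESTS_PER_MINUTE:
        _blocked.add(ip)   # serve error pages to this IP from here on
        return True
    return False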
For valid bots with a known range of IPs, like Google, Yahoo, MSN, Jeeves, etc., I let them through, but everyone else gets errors.
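The whitelist side is just checking the requesting IP against the crawler ranges each engine publishes. Another minimal sketch, with a made-up example range (look up the current ranges for each engine yourself rather than trusting this one):

import ipaddress

# Illustrative whitelist -- this range is an example, not a verified
# list of Google/Yahoo/MSN/Jeeves addresses.
ALLOWED_BOT_RANGES = [
    ipaddress.ip_network("66.249.64.0/19"),
]

def is_authorized_bot(ip):
    """True if the requesting IP falls inside a known crawler range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in ALLOWED_BOT_RANGES)

Putting the two together, the dynamic page would check is_authorized_bot() first and only run the rate-limit check for everyone else, so the known crawlers never get throttled.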
The only problem I run into is that good old Google doesn't seem to honor crawl-delay, sigh.