wilderness - 1:49 am on Feb 11, 2013 (gmt 0)
479 requests for robots.txt on one website in a twenty-four-hour period (approximately 300 the previous day).
Crawl-delay did not help, though admittedly that is not what crawl-delay is intended for.
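For reference, this is the directive I mean (a sketch; the 10-second value is just an example, and many crawlers ignore Crawl-delay entirely):

```
User-agent: *
Crawl-delay: 10
```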
And redirects won't work in this case either.
Anybody have any suggestions on how to stop this nonsense?