phranque - 12:54 pm on Apr 20, 2013 (gmt 0)
it sounds like you are describing serving a 200 OK status code and a lightweight document, instead of a 403, for some or all requests from a bot or set of bots.
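for example, a rough .htaccess sketch of that idea might look like the following - the user agent pattern and the /lightweight.html target are placeholders, not anything from your setup:

    RewriteEngine On
    # skip the lightweight page itself so the rewrite can't loop
    RewriteCond %{REQUEST_URI} !^/lightweight\.html$
    RewriteCond %{HTTP_USER_AGENT} EvilScraper [NC]
    RewriteRule .* /lightweight.html [L]

the internal rewrite serves the small static file with a 200, so the flagged bot never sees a 403.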
help us understand more about the problem.
how do you identify a "bad bot" or "bad bots"?
what resources are being requested?
are there any detectable patterns for the bot(s)? (IP, UA, referrer, rate/frequency/timing of requests, etc)
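if you do find patterns like those, a mod_setenvif sketch along these lines could flag the requests - the user agent, referrer and IP values below are made-up placeholders you'd replace with whatever actually shows up in your logs:

    SetEnvIfNoCase User-Agent  "EvilScraper|GrabberBot"  bad_bot
    SetEnvIfNoCase Referer     "spammy-example\.com"     bad_bot
    SetEnvIf       Remote_Addr "^203\.0\.113\."          bad_bot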
Please help me understand what the drawbacks are here.
typically the only requests you care about are those from actual human visitors and benevolent crawlers.
the "bad bots" should get whatever response takes the least resources.