Another option is to detect these requests by IP address range, User-Agent, or Request_URI (or some combination of these) and return a 403-Forbidden response. While most of these 'bots are as dumb as a rock, some of them do detect the server response, and some of them will "go away" if they find that their efforts are in vain.
On the other hand, some will just keep at it. The only way to find out is to test.
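As a sketch of the detect-and-403 approach in .htaccess (the IP range, User-Agent string, and path below are placeholders for illustration; substitute whatever patterns actually show up in your logs):

```apache
RewriteEngine On
# Block a sample IP range (192.0.2.x is a documentation-only range)
RewriteCond %{REMOTE_ADDR} ^192\.0\.2\. [OR]
# Block a sample User-Agent prefix
RewriteCond %{HTTP_USER_AGENT} ^BadBot [OR]
# Block a sample request path
RewriteCond %{REQUEST_URI} ^/phpmyadmin/ [NC]
# [F] returns 403-Forbidden for any request matching one of the above
RewriteRule .* - [F]
```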
Some bad-bots start their sessions with a request for a file that should not exist. The purpose is to discover if the server will return a proper 404 response to this request. If so, the bad-bot launches into a long series of requests trying to find a particular admin script -- the script that is used to configure PHP or your database, for example. In these cases, the initial 'non-existent file test' request can sometimes be detected. If you return a 200-OK response, then the bad-bots know that they won't be able to determine the correct script URL by just trying all common variations of the filename, and again, some of them will give up and go away.
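If you can spot the probe filename in your logs, one way to "fake" a 200-OK for it is to rewrite it to a page that really exists. The filename below is purely hypothetical; use the one you actually see being requested:

```apache
RewriteEngine On
# Hypothetical probe filename -- replace with the one from your logs.
# Serving a real page (200-OK) makes the bot's filename enumeration useless,
# since it can no longer tell existing scripts from non-existent ones.
RewriteRule ^some-probe-file\.php$ /index.html [L]
```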
Another option, if you have server config access, is to use custom logging. You can either "drop" these bad requests from the access log, or log them in a separate log file. See [httpd.apache.org...]
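A minimal sketch of both custom-logging options, using SetEnvIf to tag the bad requests (the "BadBot" pattern and log paths are assumptions; CustomLog with env= must go in the server or virtual-host config, not .htaccess):

```apache
# Tag requests whose User-Agent matches a known bad-bot string
SetEnvIf User-Agent "BadBot" bad_bot
# Option 1: "drop" tagged requests from the main access log
CustomLog logs/access_log combined env=!bad_bot
# Option 2: log tagged requests to their own file instead
CustomLog logs/bad_bots_log combined env=bad_bot
```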
Jim