I'd like to start building a list of bad bots that are hogging server resources. One idea I found while researching is to create a new directory and block it in robots.txt, then put a hidden link to that directory on my homepage. Something like:
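For illustration, with a hypothetical trap directory named `/bot-trap/` (the real name should be something unguessable), the robots.txt rule would be:

```
User-agent: *
Disallow: /bot-trap/
```

and the hidden homepage link might look like:

```html
<a href="/bot-trap/" style="display:none" rel="nofollow">&nbsp;</a>
```

Well-behaved crawlers honor the Disallow rule and never see the page; anything that follows the link anyway flags itself as a bot worth blocking.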
I have a similar setup on sites that are doing fine. I'm using a blocked file rather than a directory, but it appears in many places on those sites and is doing what it is supposed to do. I do not use the same link markup, but mine is a nofollowed link with onclick set to false.
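Once the trap is in place, the bad-bot list can be built by scanning the access log for anything that requested the blocked path. A minimal sketch in Python, assuming Apache's "combined" log format and the hypothetical `/bot-trap/` path used above (adjust both to your setup):

```python
import re

# Assumed honeypot path; use whatever directory you blocked in robots.txt.
TRAP_PATH = "/bot-trap/"

# Apache "combined" log format: ip ident user [time] "request" status bytes "referer" "agent"
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[[^\]]+\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'\d+ \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def find_bad_bots(lines):
    """Return {ip: user_agent} for every request that hit the trap path."""
    hits = {}
    for line in lines:
        m = LOG_LINE.match(line)
        if m and m.group("path").startswith(TRAP_PATH):
            hits[m.group("ip")] = m.group("agent")
    return hits

sample = [
    '203.0.113.7 - - [10/Oct/2024:13:55:36 +0000] "GET /bot-trap/ HTTP/1.1" 200 512 "-" "EvilScraper/1.0"',
    '198.51.100.4 - - [10/Oct/2024:13:55:37 +0000] "GET /index.html HTTP/1.1" 200 1024 "-" "Mozilla/5.0"',
]
print(find_bad_bots(sample))  # → {'203.0.113.7': 'EvilScraper/1.0'}
```

The resulting IPs (or user agents) can then be fed into a deny list in your server config or firewall.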