I'd like to start building a list of bad bots that are hogging server resources. One idea I found while researching is to create a new directory, block it in robots.txt, and then put a hidden link to that directory on my homepage. Something like:
<a style="display: none;" href="/doNotEnter/"></a>
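And the matching robots.txt rule (a minimal sketch, using the placeholder directory name from above):

    User-agent: *
    Disallow: /doNotEnter/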
No human will see the link, and no reputable bot will request the directory, since it's disallowed in robots.txt. Anything that does enter it can therefore be scrutinized and perhaps added to a block list.
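For the scrutinizing step, something like this rough sketch is what I have in mind for pulling the offending IPs out of the access log (it assumes a combined-format Apache/Nginx log at a hypothetical path; both the path and the log format vary by server):

    import re
    from collections import Counter

    HONEYPOT = "/doNotEnter/"                  # the trapped directory
    LOG_PATH = "/var/log/apache2/access.log"   # hypothetical path; varies by server

    # Combined log format: client IP is the first field,
    # the request line ("GET /path HTTP/1.x") is in quotes.
    line_re = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) ([^" ]+)')

    hits = Counter()
    with open(LOG_PATH) as log:
        for line in log:
            m = line_re.match(line)
            if m and m.group(2).startswith(HONEYPOT):
                hits[m.group(1)] += 1          # count requests per client IP

    # Candidate block list: every IP that touched the honeypot.
    for ip, count in hits.most_common():
        print(f"{ip}\t{count}")

From there I'd review each IP's user agent and behavior before actually adding it to the block list.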
It sounds like a good idea; I'm just concerned that search engines will see a hidden link pointing to a directory that's blocked in robots.txt and assume something fishy is going on. Any thoughts?