Forum Moderators: mademetop

Is a hidden link to a blocked internal page bad?

to catch bad bots



9:13 pm on Nov 27, 2012 (gmt 0)

10+ Year Member

I'd like to start building a list of bad bots that are hogging server resources. One idea I found while researching is to create a new directory and block it in robots.txt, then put a hidden link to that directory on my homepage. Something like:

<a style="display: none;" href="/doNotEnter/"></a>

User-agent: *
Disallow: /doNotEnter/

No human will see the link, and no reputable bot will follow it. Therefore anything that does request that directory can be scrutinized and perhaps added to a block list.

It sounds like a good idea; I'm just concerned that search engines will see the hidden link to a blocked directory and assume something fishy is going on. Any thoughts?
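The monitoring step described above (scrutinize whatever requests the trap directory) could be sketched as a small log scan. This is just one possible approach, assuming an Apache/Nginx combined-format access log; the trap path matches the example in the post, but the log lines and bot names here are invented for illustration:

```python
# Sketch: count (IP, user-agent) pairs that requested the blocked trap
# directory, as candidates for a block list. Log format is an assumption.
import re
from collections import Counter

TRAP = "/doNotEnter/"

# Combined log format: IP - - [date] "METHOD path PROTO" status size "referer" "user-agent"
LINE = re.compile(
    r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (\S+) [^"]*" \d+ \S+ "[^"]*" "([^"]*)"'
)

def trap_hits(log_lines):
    """Return a Counter of (ip, user_agent) pairs that requested the trap."""
    hits = Counter()
    for line in log_lines:
        m = LINE.match(line)
        if m and m.group(2).startswith(TRAP):
            hits[(m.group(1), m.group(3))] += 1
    return hits

# Hypothetical sample lines, for illustration only:
sample = [
    '203.0.113.7 - - [27/Nov/2012:21:13:05 +0000] "GET /doNotEnter/ HTTP/1.1" 200 123 "-" "BadBot/1.0"',
    '198.51.100.2 - - [27/Nov/2012:21:13:06 +0000] "GET /index.html HTTP/1.1" 200 456 "-" "Mozilla/5.0"',
]
for (ip, ua), n in trap_hits(sample).items():
    print(ip, ua, n)
```

Anything surfacing here is only a candidate; you'd still want to check it against known good crawlers before blocking.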


9:52 pm on Nov 27, 2012 (gmt 0)

WebmasterWorld Administrator 5+ Year Member Top Contributors Of The Month

I have a similar setup on sites that are doing fine. I use a blocked file rather than a directory; it's linked from many places on the sites and it does what it's supposed to do. I don't use
style="display: none;"
though. Instead, it's a nofollowed link with an onclick that returns false.
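That variant might look something like this (the file name is hypothetical; the thread doesn't give one):

```html
<!-- No display:none; the link is nofollowed and onclick cancels
     navigation for humans who click it by accident -->
<a href="/trap.html" rel="nofollow" onclick="return false;">&nbsp;</a>
```

with a matching `Disallow: /trap.html` line in robots.txt, so only bots that ignore both robots.txt and nofollow end up requesting it.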
