

Cloaking robots.txt for security reasons?


WebBender

7:58 am on Jan 19, 2004 (gmt 0)

10+ Year Member



I could swear Brett mentioned it in passing in a post somewhere. Maybe within the supporters forum...

Did a search and couldn't find anything. :(

Has anyone cloaked their robots.txt file for security reasons?

By that, I mean a fair number of script kiddies (or worse) will check the robots.txt file to see which directories are worth trying to get into.

If at all possible, I'd like to provide spiders with the proper robots.txt, but serve everyone else a fake one.

Are there any other techniques to accomplish what I want?

TIA

WB

volatilegx

8:58 pm on Jan 19, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I can't remember hearing of somebody doing it, but to do so, you'd have to turn on server-parsing of .txt files with a .htaccess file like this:

# keep serving .txt files with the text/plain content type
AddType text/plain .txt
# run .txt files through the SSI handler (mod_include)
AddHandler server-parsed .txt

This would enable you to insert a Server Side Include into an otherwise blank robots.txt file. The Server Side Include could call a script which would compare the IP Address of the client with a database of known search engine spider IP addresses. The regular robots.txt file would be served to search engine spiders, and a blank or "safe" robots.txt file would be served to everybody else.
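A rough sketch of how that could be wired up, just to illustrate the idea (the script path /cgi-bin/robots.py, the spider IP file, and the Disallow lines below are all made-up examples, and you'd need SSI and CGI enabled on the server, e.g. Options +Includes for that directory). The server-parsed robots.txt would hold nothing but the include:

<!--#include virtual="/cgi-bin/robots.py" -->

and the included script decides which version to print, something like:

#!/usr/bin/env python3
# Hypothetical CGI script called from the SSI include in robots.txt.
# It prints the real robots.txt only to clients whose IP is on a list of
# known search engine spider addresses; everyone else gets a harmless file.
import os

# Assumption: spider IPs are kept in a plain file, one address per line.
SPIDER_IP_FILE = "/var/www/data/spider_ips.txt"

REAL_ROBOTS = """User-agent: *
Disallow: /admin/
Disallow: /private/
"""

SAFE_ROBOTS = """User-agent: *
Disallow:
"""

def load_spider_ips(path):
    try:
        with open(path) as f:
            return {line.strip() for line in f if line.strip()}
    except OSError:
        return set()

client_ip = os.environ.get("REMOTE_ADDR", "")
body = REAL_ROBOTS if client_ip in load_spider_ips(SPIDER_IP_FILE) else SAFE_ROBOTS

# CGI header, then the chosen robots.txt body
print("Content-Type: text/plain")
print()
print(body, end="")

The obvious maintenance cost is the IP list itself: spider addresses change over time, so whatever database you compare against has to be kept current, or legitimate crawlers will start getting the "safe" file.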

WebBender

1:10 am on Jan 20, 2004 (gmt 0)

10+ Year Member



Thanks. I probably won't end up doing it, but it's good to know.

WB

creative craig

3:17 pm on Feb 6, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Have a read of this thread:

[webmasterworld.com...]

It's in the library over in the robots.txt forum :)

Craig
