Cloaking robots.txt for security reasons? WebBender
I could swear Brett mentioned it in passing in a post somewhere. Maybe within the supporters forum...
Did a search and couldn't find anything. :(
Has anyone cloaked their robots.txt file for security reasons?
By that, I mean a fair number of script kiddies (or worse) will check the robots.txt file to see what directories are worth trying to get into.
If at all possible, I'd like to provide spiders with the proper robots.txt, but serve everyone else a fake robots.txt.
Are there any other techniques to accomplish what I want?
I can't remember hearing of somebody doing it, but to do so, you'd have to turn on server-parsing of .txt files with a .htaccess file like this:
AddType text/plain .txt
AddHandler server-parsed .txt
This would enable you to insert a Server Side Include into an otherwise blank robots.txt file. The Server Side Include could call a script which would compare the IP Address of the client with a database of known search engine spider IP addresses. The regular robots.txt file would be served to search engine spiders, and a blank or "safe" robots.txt file would be served to everybody else.
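To make the idea concrete, here is a minimal sketch of the script such an SSI could call. Everything here is an assumption for illustration: the spider IP addresses are placeholders (not real search engine addresses), and the `robots_for` function name and the disallowed paths are hypothetical. A real setup would keep a maintained database of spider IPs, as described above.

```python
import os

# Hypothetical allow-list of known spider IPs -- placeholder values only.
KNOWN_SPIDER_IPS = {"192.0.2.10", "192.0.2.20"}

# The real robots.txt, served only to recognized spiders.
REAL_ROBOTS = """User-agent: *
Disallow: /admin/
Disallow: /private/
"""

# A harmless "safe" robots.txt for everyone else.
SAFE_ROBOTS = """User-agent: *
Disallow:
"""

def robots_for(client_ip: str) -> str:
    """Return the real rules for known spiders, a blank/safe file otherwise."""
    return REAL_ROBOTS if client_ip in KNOWN_SPIDER_IPS else SAFE_ROBOTS

if __name__ == "__main__":
    # Under SSI/CGI the client's address arrives in the REMOTE_ADDR variable.
    print(robots_for(os.environ.get("REMOTE_ADDR", "")))
```

With the .htaccess lines above in place, an otherwise blank robots.txt containing a single `<!--#include virtual="..." -->` directive pointing at this script would emit one file or the other depending on who is asking.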
Thanks. I probably won't end up doing it, but it's good to know.
Have a read of this thread:
It's in the library over in the robots.txt forum :)