Cloaking robots.txt for security reasons? WebBender msg:675333 7:58 am on Jan 19, 2004 (gmt 0) I could swear Brett mentioned it in passing in a post somewhere. Maybe within the supporters forum...
Did a search and couldn't find anything. :(
Has anyone cloaked their robots.txt file for security reasons?
By that, I mean a fair number of script kiddies (or worse) will check the robots.txt file to see which directories are worth trying to get into.
If at all possible, I'd like to serve spiders the proper robots.txt, but serve everyone else a fake one.
Are there any other techniques to accomplish what I want?
volatilegx msg:675334 8:58 pm on Jan 19, 2004 (gmt 0)
I can't remember hearing of somebody doing it, but to do so, you'd have to turn on server-parsing of .txt files with a .htaccess file like this:
AddType text/plain .txt
AddHandler server-parsed .txt
This would enable you to insert a Server Side Include into an otherwise blank robots.txt file. The Server Side Include could call a script which would compare the IP Address of the client with a database of known search engine spider IP addresses. The regular robots.txt file would be served to search engine spiders, and a blank or "safe" robots.txt file would be served to everybody else.
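The IP-comparison logic that the included script would perform might look something like this minimal Python sketch. The spider IP range, the rule sets, and the function name are all assumptions for illustration; a real deployment would need a maintained database of spider addresses, as described above.

```python
import ipaddress

# Hypothetical list of known search-engine spider networks.
# These ranges are illustrative only; a real list would need to be
# maintained against each engine's published or observed addresses.
SPIDER_NETWORKS = [
    ipaddress.ip_network("66.249.64.0/19"),  # assumed Googlebot range
]

# The real rules, served only to recognized spiders.
FULL_ROBOTS = """User-agent: *
Disallow: /admin/
Disallow: /private/
"""

# A "safe" version that reveals nothing, served to everyone else.
SAFE_ROBOTS = """User-agent: *
Disallow:
"""

def robots_for(client_ip: str) -> str:
    """Return the real robots.txt for known spider IPs, a safe one otherwise."""
    addr = ipaddress.ip_address(client_ip)
    if any(addr in net for net in SPIDER_NETWORKS):
        return FULL_ROBOTS
    return SAFE_ROBOTS
```

In an SSI setup, the script called by the include would print `robots_for()` of the client's address (e.g. from the `REMOTE_ADDR` environment variable) into the otherwise blank robots.txt.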
WebBender msg:675335 1:10 am on Jan 20, 2004 (gmt 0)
Thanks. I probably won't end up doing it, but good to know.
creative craig msg:675336 3:17 pm on Feb 6, 2004 (gmt 0)
Have a read of this thread:
It's in the library over in the robots.txt forum :)