Did a search and couldn't find anything. :(
Has anyone cloaked their robots.txt file for security reasons?
By that, I mean a fair number of script kiddies (or worse) will check the robots.txt file to see which directories are worth trying to break into.
If at all possible, I'd like to serve spiders the proper robots.txt, but serve everyone else a fake one.
Are there any other techniques to accomplish what I want?
AddType text/plain .txt
AddHandler server-parsed .txt
This would enable you to insert a Server Side Include into an otherwise blank robots.txt file. The Server Side Include could call a script which would compare the IP Address of the client with a database of known search engine spider IP addresses. The regular robots.txt file would be served to search engine spiders, and a blank or "safe" robots.txt file would be served to everybody else.
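A rough sketch of what that IP-checking script might look like, written here as a Python CGI script (the spider IP ranges, the rule text, and the script itself are all illustrative assumptions, not a tested setup -- real spider addresses would have to be collected and kept current):

```python
# Hypothetical CGI script an SSI directive could call, e.g.:
#   <!--#include virtual="/cgi-bin/robots.py" -->
# Serves the real rules to known spider IPs, a harmless file to everyone else.
import ipaddress
import os

# Known search-engine spider networks (illustrative values only).
SPIDER_NETWORKS = [
    ipaddress.ip_network("66.249.64.0/19"),   # e.g. a Googlebot range
    ipaddress.ip_network("157.55.39.0/24"),   # e.g. a Bingbot range
]

REAL_ROBOTS = """User-agent: *
Disallow: /admin/
Disallow: /private/
"""

SAFE_ROBOTS = """User-agent: *
Disallow:
"""

def select_robots(client_ip: str) -> str:
    """Return the real rules for known spiders, a blank/safe file otherwise."""
    try:
        addr = ipaddress.ip_address(client_ip)
    except ValueError:
        # Missing or malformed address: fail safe, reveal nothing.
        return SAFE_ROBOTS
    if any(addr in net for net in SPIDER_NETWORKS):
        return REAL_ROBOTS
    return SAFE_ROBOTS

if __name__ == "__main__":
    # Under Apache, the client's address arrives in the REMOTE_ADDR variable.
    print("Content-Type: text/plain\n")
    print(select_robots(os.environ.get("REMOTE_ADDR", "")), end="")
```

Note that a plain IP list is easy to fool or let go stale; matching whole published network ranges, as above, is a bit more robust, and the script fails safe by serving the harmless file whenever the address is unknown.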