| 10:38 am on Mar 24, 2004 (gmt 0)|
It involves setting up your web server so that .txt files are served by a dynamic script.
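For instance, a minimal sketch on an Apache host with mod_php (a hypothetical setup — check what your host actually supports):

```apache
# .htaccess sketch: run .txt files through the PHP engine,
# so robots.txt can be a script that varies its output
# depending on who is asking
AddType application/x-httpd-php .txt
```

With that in place, robots.txt itself can contain PHP that serves one thing to crawlers and another to browsers.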
| 10:43 am on Mar 24, 2004 (gmt 0)|
*scratches head* ok.. now I'm really curious.
Why would you want to block those few humans who would think to look?
On occasion I check robots.txt when doing link exchanges, if there is no PR on the links page, just to make sure robots aren't blocked from it.
| 11:14 am on Mar 24, 2004 (gmt 0)|
>>Why would you want to block those few humans
>>who would think to look?
You could have sensitive folders in there somewhere. While not a great deterrent, it at least adds one more step to finding your private stuff.
| 11:19 am on Mar 24, 2004 (gmt 0)|
Several huge sites cloak, or have in the past cloaked, their robots.txt (including eBay...).
Check out the msg by jd here:
| 1:31 pm on Mar 24, 2004 (gmt 0)|
>> Check out the msg by jd here:
Thank you, I found some useful information there.
| 1:00 pm on Apr 3, 2004 (gmt 0)|
I'd also like to stop normal users from accessing robots.txt.
Unfortunately, my server provider doesn't allow the use of .htaccess.
If you created a directory called 'robots.txt' and put an 'index.php' in it, with a script that checks whether the user agent contains 'Mozilla' and then echoes different content depending on whether the requester is a robot or a user, would that work?
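A minimal sketch of what such an 'index.php' might look like (the 'Mozilla' substring test is the check you describe, not a reliable robot detector — bear in mind that some crawlers also include 'Mozilla' in their user-agent string, so a real setup would match known bot names instead):

```php
<?php
// index.php inside a directory named robots.txt/ (hypothetical setup)
// Most browsers send a User-Agent containing "Mozilla"; many crawlers
// identify themselves by name instead.
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

header('Content-Type: text/plain');

if (strpos($ua, 'Mozilla') === false) {
    // Looks like a robot: serve the real rules
    echo "User-agent: *\n";
    echo "Disallow: /private/\n";
} else {
    // Looks like a human browser: serve something harmless
    echo "User-agent: *\n";
    echo "Disallow:\n";
}
?>
```

Whether a crawler requesting /robots.txt ends up here also depends on how your server handles the missing trailing slash and directory indexes, so test it before relying on it.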