Is there a way to deny access to all user public_html directories, yet allow access to the main site (to spiders, that is)? Or do I just have to tell the users to put a robots.txt file in their public_html directories?
bakedjake
3:35 pm on Jun 11, 2004 (gmt 0)
Welcome to WebmasterWorld, kj6loh. (That's a callsign I assume - KC8ORV here).
If all users are under a subdirectory, just disallow that subdirectory. For example, if you have:
/users/allan
/users/bob
/users/felix
Then just disallow /users.
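In robots.txt terms that would be something like:

    User-agent: *
    Disallow: /users/

Disallow is a simple prefix match against the URL path, so that one line covers every directory under /users/.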
kj6loh
7:19 pm on Jun 11, 2004 (gmt 0)
OK, that's simpler than I thought, thanks. But then how will spiders/robots actually read the file? If user dirs are in /home and the www server sees /home/user/public_html, how is a robots.txt file going to be effective? Wouldn't each user have to put one in their public_html dir? I.e. /home/usern/public_html containing index.html, robots.txt, ...
kj6loh
7:21 pm on Jun 11, 2004 (gmt 0)
Sorry, just a brainfart; I understand now, thanks. And yes, it is a callsign.
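For the record, the bit I was missing: robots.txt is matched against URL paths, not filesystem paths. Assuming the usual setup where the server maps /~user/ URLs onto /home/user/public_html, a single file at the document root along the lines of

    User-agent: *
    Disallow: /~

should cover every user directory in one go (again, Disallow being a prefix match), with no need for per-user files.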