Forum Moderators: goodroi

Robots dot text for all user files?

robots.txt all public_html

   
1:28 am on Jun 11, 2004 (gmt 0)

Is there a way to deny spiders access to all user public_html directories while still allowing access to the main site? Or do I just have to tell the users to put a robots.txt file in their public_html directories?
3:35 pm on Jun 11, 2004 (gmt 0)

WebmasterWorld Administrator bakedjake is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Welcome to WebmasterWorld, kj6loh. (That's a callsign I assume - KC8ORV here).

If all users are under a subdirectory, just disallow that subdirectory. For example, if you have:

/users/allan
/users/bob
/users/felix

Then just disallow /users.
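With that layout, the root robots.txt would be a minimal two-line file like this (User-agent: * matches all compliant crawlers; the trailing slash keeps the rule scoped to paths under /users/):

```
User-agent: *
Disallow: /users/
```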

7:19 pm on Jun 11, 2004 (gmt 0)

OK, that's simpler than I thought then, thanks. But then how will spiders/robots actually read the file? If user dirs are in /home and the web server sees /home/user/public_html, how is a robots.txt file going to be effective? Wouldn't each user have to put one in their public_html dir?
i.e.
/home/usern/public_html
index.html
robots.txt
...
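If those home directories are exposed through something like Apache's mod_userdir (an assumption about this setup, where /home/usern/public_html would show up as /~usern/ on the site), crawlers only ever request robots.txt from the root of the host, never from subdirectories, so a single root-level file can still cover every user directory:

```
User-agent: *
Disallow: /~
```

Any URL beginning with /~ (e.g. /~usern/index.html) would then be off-limits to all compliant robots, and no per-user robots.txt files would be needed.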
7:21 pm on Jun 11, 2004 (gmt 0)

Sorry, just a brain fart; I understand now, thanks. And yes, it is a callsign.