
Robots.txt

12:59 am on Sep 1, 2003 (gmt 0)

New User

10+ Year Member

joined:Feb 27, 2003
posts:31
votes: 0


If I have 50 sites operating on a server, do I need to place a robots.txt file in every domain's root folder, or can I place it in one central location that can be used by all of the domains? Sorry for the newbie question.
1:59 am on Sept 1, 2003 (gmt 0)

Preferred Member

10+ Year Member

joined:Aug 20, 2003
posts:408
votes: 0


I think you have to put the file in the root of each site. Presumably each site is different and would want to disallow files/directories based on its own structure - unless all the sites are identical?
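
For example (the directory names here are only placeholders), one site might need:

User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/

...while another site blocks a completely different set of paths, so a single shared file wouldn't suit both.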
4:26 am on Sept 1, 2003 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Oct 15, 2002
posts:710
votes: 0


You'd need to place it in the Document Root of each site, yes.
8:19 am on Sept 1, 2003 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Apr 3, 2003
posts:1633
votes: 0


If you just want to use robots.txt to ban particular user-agents (so the same file would be fine for every site), and you're on a *nix box, you can maintain a single central copy and create a soft link to it in each site's directory.
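
A shared file like that might be as simple as (the bot name here is made up):

User-agent: SomeBadBot
Disallow: /

User-agent: *
Disallow:

That blocks the named bot everywhere and leaves everything open to the rest.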

You still have to create all the soft links, but after that you can quickly make one change that will impact all sites.

The command to create a soft (symbolic) link is "ln -s", so if you were in the wwwroot directory of SiteA, you might type:

ln -s /path/to/robots.txt robots.txt

...where /path/to/robots.txt is your central copy.
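
And if all the document roots follow the same pattern (the /var/www/*/wwwroot layout below is only an assumption - adjust it to match your server), a quick shell loop can create all 50 links at once:

# create one symlink per site, all pointing at the central copy
for d in /var/www/*/wwwroot; do
    ln -s /path/to/robots.txt "$d/robots.txt"
done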