|where to place robots.txt?|
robots newbie alert
| 10:14 am on Jun 10, 2004 (gmt 0)|
I've been reading a little about robots, as my bandwidth usage is going through the roof.
My question is: where do I place the robots.txt file?
On my hosting package, I have folders set up, each of which contains the pages for a different site with the url pointing to that folder.
Do I just need one robots in the root above all these folders, or one robots.txt in each folder for each site?
| 10:31 am on Jun 10, 2004 (gmt 0)|
|On my hosting package, I have folders set up, each of which contains the pages for a different site with the url pointing to that folder. |
What do you mean, you have more than one URL/domain on your hosting package? If so, you need a separate robots.txt for each URL/domain. You'll need to explain your setup a bit more clearly so you aren't steered the wrong way.
| 10:47 am on Jun 10, 2004 (gmt 0)|
In short, you need to place your robots.txt in the web root. This means your robots.txt must be available through the URL [yoursite.com...] . If you have more than one domain, you must place one robots.txt in the root of each domain; this also applies to virtual domains (more than one website per server).
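For what it's worth, the file itself is just plain text. A minimal sketch of one (the /private/ folder here is only an example path, not something from your setup):

```
User-agent: *
Disallow: /private/
```

That would tell all well-behaved spiders to skip everything under /private/ on that one domain.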
| 11:35 am on Jun 10, 2004 (gmt 0)|
From the root of my hosting package I have:
/folderABC/<all the files for website1>
/folderDEF/<all the files for website2>
and so on, with different urls pointing to each folder for each different site.
So I think I need to place a robots.txt in each of these folders, i.e. one per site? Or, if I place one robots.txt in the root folder above all of these, will that be enough?
| 5:11 pm on Jun 10, 2004 (gmt 0)|
No. The robots.txt file must be available to site visitors such as search spiders, which means it has to be in the root-level folder of each and every website, not one level above that. A file one level above the web root can't be accessed through the web at all. If you can't type yoursite.com/robots.txt into a browser and see your robots.txt file, nothing else can see it over the web either.
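That's also why the location matters: a spider fetches exactly one URL, /robots.txt at the root of the domain, and applies those rules to everything it crawls on that site. A quick sketch of how a spider interprets the rules, using Python's standard urllib.robotparser (the rules and URLs below are just illustrations, not from your site):

```python
from urllib.robotparser import RobotFileParser

# Example rules a site might serve at http://yoursite.com/robots.txt
# (the /private/ path is only an illustration)
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

parser = RobotFileParser()
parser.parse(rules)

# A well-behaved spider checks each URL against the rules before fetching it
print(parser.can_fetch("*", "http://yoursite.com/private/page.html"))  # blocked
print(parser.can_fetch("*", "http://yoursite.com/index.html"))         # allowed
```

Note the rules only apply to the one domain they were fetched from, which is why each site needs its own copy.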
| 10:17 pm on Jun 10, 2004 (gmt 0)|
thanks all, will have a try tomorrow with it!