Forum Moderators: goodroi


Robots.txt command? "do not search anything outside this."

         

OWizeOne

5:24 pm on Jan 29, 2007 (gmt 0)

10+ Year Member



Hi all,

I was wondering: what is the best way to use robots.txt within a directory (a folder, actually) and have the bots index only the files within that folder and nothing else (outside)?

Any solutions?

goodroi

12:55 pm on Jan 30, 2007 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Generally you want to put the robots.txt in the root, where it can be used to allow or disallow directories. So if you want to use robots.txt for your situation, you should list all the directories you don't want it to index. I also like to use .htaccess to block access.
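For illustration only (the directory names here are made up), a root robots.txt along those lines might look like:

```
User-agent: *
Disallow: /site-two/
Disallow: /site-three/
```

Each Disallow line covers one directory you don't want crawled; anything not listed remains crawlable, which is why every new directory needs its own Disallow line.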

cheers

OWizeOne

9:09 pm on Jan 31, 2007 (gmt 0)

10+ Year Member



Thank you goodroi,

I was afraid of that. This will cause a lot of trouble, since I'm adding new directories all the time and don't want to edit the robots.txt every time I do so.

Maybe you can suggest a better solution...

My situation is this: I am using one hosting account to host a few domains. One domain resides in the root and the others in separate directories. Each domain is forwarded to its specific directory and masked, so users coming to the domain (site) only "see" that domain. (That works fine and saves me hosting dollars.)

The problem starts when I want to use the robots.txt.

The situation now is that Google associates the domains (meaning when someone searches for one domain specifically, he gets results with all the other domains too). This happens because the robot indexes all the pages together and there is no separation.

I would like to use a robots.txt for each domain (in its own directory) and limit the bots to index only that directory and nothing "above" it.

Is there any better way of doing this?

If I host each domain on a separate subdomain instead of using a masked directory as a home, would that work better?

What is the best way of doing this? Please help.

Vimes

3:04 am on Feb 1, 2007 (gmt 0)

10+ Year Member



Hi,

It sounds like you haven't set the domains up as sub-domains; if that's the case, you need to.

example.com/
sub1.example.com
sub2.example.com
etc.

If this is done, then you can place a robots.txt file in each root directory, making each file unique.
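For illustration (the host names are just the placeholders above), each sub-domain's document root would then hold its own file, e.g.:

```
# Served from sub1.example.com/robots.txt
User-agent: *
Disallow:
```

An empty Disallow allows everything. Because crawlers fetch robots.txt per host, the file on sub1.example.com applies only to that sub-domain; example.com and sub2.example.com each answer with their own rules.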

Vimes.

OWizeOne

1:08 pm on Feb 1, 2007 (gmt 0)

10+ Year Member



Thanks for your advice. I haven't tried that yet, but I will.

Thanks again for your help.