Forum Moderators: goodroi
I was afraid of that. This will cause a lot of trouble, since I'm adding new directories all the time and don't want to edit the robots.txt every time I do so.
Maybe you can suggest a better solution...
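Just to show what I mean: the single robots.txt at the account root would need a Disallow line for every domain's directory, and a new line every time I add one (directory names below are only placeholders):

```
# /robots.txt at the hosting account root
# (directory names are placeholders for my actual domain directories)
User-agent: *
Disallow: /domain2-dir/
Disallow: /domain3-dir/
# ...one more line for each new directory I add
```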
My situation is this: I am using one hosting account to host a few domains. One domain resides in the root and the others in separate directories. Each domain is forwarded to its specific directory and masked, so users coming to that domain (site) only "see" that domain. (That works fine and saves me hosting dollars.)
The problem starts when I want to use robots.txt.
The situation now is that Google associates the domains (meaning when someone searches for one domain specifically, he gets results from all the other domains too). This happens because the robot indexes all the pages together and there is no separation.
I would like to use a robots.txt for each domain (in its own directory) and limit the bots to indexing only that directory and nothing "above" it.
Is there any better way of doing this?
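What I had in mind was dropping a file like this into each domain's directory, assuming a crawler would read a robots.txt that isn't at the root of the host (which, from the answer above, it won't):

```
# /somedomain-dir/robots.txt (hypothetical -- crawlers only
# request robots.txt from the root of a host, so this would
# never actually be fetched)
User-agent: *
Disallow: /
Allow: /somedomain-dir/
```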
If I host each domain on a separate subdomain instead of using a masked directory as its home, would that work better?
What is the best way of doing this? Please help.