Forum Moderators: goodroi
I have searched this site and the net generally but can't find the answer I'm looking for.
Basically, because of how my addon domain is set up, search engines are caching my addon domain (and its subdomain) twice.
This is what I have within one hosting package:
Primary domain: maindom.org
Addon domain: addondom.org
Subdomain at addon: subdom.addondom.org
I have robots.txt in all three domains as follows:
User-agent: *
Disallow:
However, the Google cache shows content as follows for the addon and sub domains:
maindom.org/addondom/ ....
addondom.org/ ....
maindom.org/subdom
subdom.addondom.org
This is obviously because both the addondom and subdom folders sit in the public root of maindom.org.
What I want is for the bot cache to show only:
addondom.org and
subdom.addondom.org
and not the folders representing the addon domain and subdomain. The addon and subdomain were created using cPanel.
I'm guessing the answer to my question is to change the robots.txt file in maindom to read:
User-agent: *
Disallow: /addondom/
Disallow: /subdom/
and leave the existing allow-all robots.txt files in the addon and subdomain folders unchanged.
However, would this change stop bots from crawling the addon domain (and its subdomain)? I don't want to change the robots.txt file if that would happen.
Many thanks for any help.
Alex
Bots don't know your folder structure, so they can't work out that those folders are the same content as a separate domain.
For a particular domain or subdomain, a bot will only compare URLs against the robots.txt fetched from the root of that (sub)domain.
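You can see this per-host behaviour with Python's standard urllib.robotparser. This is just an illustrative sketch using your example hostnames (maindom.org, addondom.org) and a made-up page URL; a real crawler fetches each host's robots.txt separately, so the Disallow lines you add on maindom.org never apply to addondom.org:

```python
from urllib.robotparser import RobotFileParser

# Rules as they would be served at http://maindom.org/robots.txt
# (the proposed change blocking the duplicate folders).
main_rules = RobotFileParser()
main_rules.parse([
    "User-agent: *",
    "Disallow: /addondom/",
    "Disallow: /subdom/",
])

# Rules as served at http://addondom.org/robots.txt
# (the existing allow-all file, left unchanged).
addon_rules = RobotFileParser()
addon_rules.parse([
    "User-agent: *",
    "Disallow:",
])

# The duplicate folder URL on the main domain is blocked...
print(main_rules.can_fetch("*", "http://maindom.org/addondom/page.html"))  # False
# ...while the same content on the addon domain stays crawlable,
# because a bot only consults that host's own robots.txt.
print(addon_rules.can_fetch("*", "http://addondom.org/page.html"))  # True
```

So the change you propose should do exactly what you want: it hides the folder copies under maindom.org without affecting crawling of addondom.org or subdom.addondom.org.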