test.domain.com is a mirror dev site. It isn't linked from anywhere, but Google has indexed it anyway. For fear of duplicate content, I don't want Google to crawl it.
How do I exclude test.domain.com? Do I block it in the robots.txt on www.domain.com (which is on a different server), or in the robots.txt at the root of the test.domain.com server itself?
I do have a few other subdomains on the live server, like region.domain.com.
If I block robots from the dev subdomain, will the crawling of my other live subdomains be affected at all?
Am I correct in assuming that a block at the root of the test server applies only to that subdomain?
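For reference, a robots.txt that tells all compliant crawlers to stay off an entire host is just:

```
User-agent: *
Disallow: /
```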
So the above, when placed in robots.txt at the root of the test server (test.domain.com), will apply only to that subdomain.
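To convince yourself of the per-host scoping, here's a small sketch using Python's standard `urllib.robotparser`. The hostnames are the ones from this thread; the robots.txt contents are hypothetical stand-ins for what each server would serve — the point is that a crawler fetches /robots.txt separately for each host, so rules never leak across subdomains:

```python
from urllib import robotparser

# Hypothetical robots.txt contents per host. A crawler requests each
# host's /robots.txt independently, so the dev site's block-all rules
# have no effect on the live subdomains.
robots_by_host = {
    "test.domain.com": ["User-agent: *", "Disallow: /"],  # dev mirror: block all
    "region.domain.com": ["User-agent: *", "Disallow:"],  # live: allow all
}

def can_crawl(host, path, agent="*"):
    """Return True if `agent` may fetch `path` on `host`,
    according to that host's own robots.txt."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_by_host[host])
    return rp.can_fetch(agent, f"http://{host}{path}")

print(can_crawl("test.domain.com", "/index.html"))    # False: dev site blocked
print(can_crawl("region.domain.com", "/index.html"))  # True: live site unaffected
```

Blocking the dev mirror this way leaves region.domain.com (and any other live subdomain) crawlable, since each serves its own robots.txt.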
I know I seem overly paranoid, but I don't want to screw anything up.