Suppose I have a.domain.com and b.domain.com, and I'd like to block only a.domain.com from Google. Is this possible? If yes, what is the right way to do it?
Lord Majestic
1:07 pm on May 3, 2005 (gmt 0)
Yes, it is possible, because every subdomain is checked for its own robots.txt located in its root. So to block all URLs on a.domain.com, you just need this robots.txt in its root:
User-agent: Googlebot
Disallow: /
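You can sanity-check how a crawler would interpret those two lines with Python's standard-library robots.txt parser. This is just a sketch: the hostnames and paths are placeholders, and real Googlebot has its own parser, but `urllib.robotparser` follows the same basic rules.

```python
from urllib import robotparser

# Simulate the robots.txt that would sit at http://a.domain.com/robots.txt
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: Googlebot",
    "Disallow: /",
])

# Googlebot is blocked from every URL on this host...
print(rp.can_fetch("Googlebot", "http://a.domain.com/page.html"))  # False

# ...but a bot not matched by any User-agent line is still allowed,
# since there is no fallback "User-agent: *" record here.
print(rp.can_fetch("SomeOtherBot", "http://a.domain.com/page.html"))  # True
```

Because b.domain.com serves its own (or no) robots.txt from its own root, it is unaffected by the file on a.domain.com.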
Pat1975
8:47 am on May 4, 2005 (gmt 0)
Thanks for the answer. So, what if I wish to block search engines from all subdomains? Should I place the robots.txt in every subdomain's root, or just in the main domain's?