Forum Moderators: goodroi
If they point to different folders, then yes, you would need to place a robots.txt in the root folder of each (sub)domain on which you want to place restrictions on visiting bots. The Disallow paths in each file are relative to the root folder of that particular domain.
The robots.txt for each one should be viewable if you browse to
http://www.example.com/robots.txt
[subdomain.example.com...]
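As a minimal sketch, a robots.txt served from the subdomain's own root might look like this (the blocked folder name here is purely hypothetical):

```
# Served at http://subdomain.example.com/robots.txt
# Paths are relative to the subdomain's root, not the main domain's root.
User-agent: *
Disallow: /private/
```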
Also, some people have their subdomains pointing to a subfolder of the main domain, so that a request for
www.example.com/subdomainfolder/examplepage.htm
and
subdomain.example.com/examplepage.htm
would serve up the same page.
In that instance I would also place a Disallow for the subdomain folder in the main domain's robots.txt, to avoid duplicate-content problems.
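Using the folder name from the example above, the main domain's robots.txt would then include something like:

```
# Served at http://www.example.com/robots.txt
# Blocks bots from crawling the duplicate copy of the subdomain's pages
# that is reachable through the main domain.
User-agent: *
Disallow: /subdomainfolder/
```

The subdomain's own robots.txt is unaffected by this; bots requesting subdomain.example.com/robots.txt never see the main domain's file.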
See my comments here:
[webmasterworld.com...]