I want to block spiders from crawling various hosts under my domain, like foo.mydomain.com. These hostnames are mapped via DNS to remote sites, and I have no access to the remote servers' web root directories.
For example, let's say I've mapped foo.mydomain.com to images.someotherdomain.com.
My initial thought was to build robots.txt like this...
...but I see nothing in the specification for excluding domains, only directories and files.
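For reference, directory and file exclusion is all the spec's syntax expresses, and a robots.txt only governs the host that serves it. A minimal sketch (the paths here are just placeholders):

```
# Served from http://foo.mydomain.com/robots.txt
# Rules apply only to this hostname, not to other hosts or domains
User-agent: *
Disallow: /some-directory/
Disallow: /some-file.html
```

So there seems to be no way for a robots.txt on one host to disallow a different hostname.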
Any ideas or feedback appreciated :-)