
Subdomains, DNS, IIS

How to handle robots.txt requests for multiple subdomains...


jtoddv

9:23 pm on May 8, 2006 (gmt 0)

10+ Year Member



I have a client with multiple subdomains served by IIS. All subdomains have the same content and exist only for tracking purposes. To avoid any potential duplicate-content penalties, I would like to block the duplicate subdomains with robots.txt. There are a couple of ways they could be handling this, and I am getting clarification on which is currently implemented, but I would love answers to both for future reference.

How can I implement a separate robots.txt for each subdomain if the subdomains are pointed via DNS? And how if they are handled entirely within IIS (host headers)? Remember, both setups would be serving the same file set.
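One way this could work: if every /robots.txt request were routed to a single server-side handler, that handler could key off the Host header. A rough sketch of the idea, in Python just for illustration (on IIS this would really be an ASP/ISAPI script, and the host names below are made-up placeholders, not the client's actual subdomains):

```python
# Sketch: serve a host-specific robots.txt from one shared file set.
# The host names here are hypothetical examples.

CANONICAL_HOST = "www.example.com"

ALLOW_ALL = "User-agent: *\nDisallow:\n"      # empty Disallow = allow everything
DISALLOW_ALL = "User-agent: *\nDisallow: /\n"  # block the whole subdomain

def robots_txt_for(host: str) -> str:
    """Return robots.txt content: allow crawling only on the canonical
    host; every tracking subdomain gets a blanket Disallow."""
    host = host.split(":")[0].lower()  # strip any :port, normalize case
    return ALLOW_ALL if host == CANONICAL_HOST else DISALLOW_ALL
```

The same file set then answers every subdomain, but crawlers see a different robots.txt depending on which hostname they request it from.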

I was thinking of maybe writing a script that checks the requested subdomain and adds a Robots META tag when it is one of the subdomains we do not want indexed.
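The META-tag version of that check might look something like this. Again just a sketch in Python; the real script would run server-side in the IIS pages (e.g. ASP), and the subdomain names are placeholders I invented:

```python
# Sketch: emit a noindex robots META tag when the request arrives on one
# of the duplicate tracking subdomains. Host names are hypothetical.

TRACKING_SUBDOMAINS = {"track1.example.com", "track2.example.com"}

def robots_meta_tag(host: str) -> str:
    """Return a META tag to drop into <head>: noindex for the tracking
    subdomains, an empty string for the main site."""
    host = host.split(":")[0].lower()  # strip any :port, normalize case
    if host in TRACKING_SUBDOMAINS:
        return '<meta name="robots" content="noindex,nofollow">'
    return ""
```

Each shared page template would call this with the request's Host header and write the result into the page head, so the same file set noindexes itself only on the duplicate subdomains.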

Thanks,
jtoddv