I am faced with this situation:
The ASP site has a main domain and a sub domain serving the same content. The sub domain is needed so that particular affiliate links coming in land on a page containing the appropriate 'buy through this network' information... other than that, the pages are the same as on the main domain.
If we put a Disallow in the subdomain's robots.txt file, could Google still 'find' and index the duplicate content via external links?
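By the disallow I mean something like this on the subdomain (just a sketch, assuming we'd want to block the whole subdomain from crawling):

# robots.txt served from the subdomain - "*" means all crawlers
User-agent: *
Disallow: /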
We could put a Disallow in the subdomain's robots.txt file AND add rel="canonical" tags on each of the sub domain's pages pointing to the corresponding main domain pages as the original content - would that be the way to do it?
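Something like this in the head of each subdomain page (www.example.com is just a placeholder for our main domain, and the path would be whatever the matching page is):

<!-- canonical tag pointing at the matching page on the main domain -->
<link rel="canonical" href="https://www.example.com/matching-page/" />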
Thanks for helping :-)