All my pages are created with SSI, and SSI only works on www.mydomain.com. I used a conditional header in each .shtml file with a meta refresh statement. This is the header for the file /example.shtml:
<!--#if expr="$SERVER_NAME != www.mydomain.com" -->
<meta name="robots" content="noindex,follow">
<meta http-equiv="refresh" content="10; URL=http://www.mydomain.com/example.shtml">
<!--#endif -->
[ here the rest of the page ]
This trick also works when SSI is recognized at www.hostingcompany.com/account/. How it works:
The if statement is evaluated by the SSI parser. For a request to www.mydomain.com the header is skipped; otherwise the page gets a head with "noindex,follow" followed by a meta redirect to your actual page at www.mydomain.com. The 10 in the content attribute is the number of seconds before the redirect takes place. I tried 0 seconds, but that didn't work: Googlebot went straight to www.mydomain.com, ignored the "noindex,follow" robots tag and indexed the page under the first URL, the same behaviour as a 302 redirect. By waiting a few seconds the bot picks up the noindex, and the pages are removed from the index after some time.
If SSI doesn't work at www.hostingcompany.com/account/, as in my case, the effect is almost the same: the if statement is not parsed but treated as an HTML comment, and the head block is pasted into the output unconditionally.
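To illustrate: on a host that does not parse SSI, the page source reaches the browser (and the bot) verbatim. The unrecognized directive is just an ordinary HTML comment, so the two meta tags are always in effect, which is exactly what you want on the non-canonical host:

```html
<!--#if expr="$SERVER_NAME != www.mydomain.com" --> <!-- ignored: plain comment here -->
<meta name="robots" content="noindex,follow">
<meta http-equiv="refresh" content="10; URL=http://www.mydomain.com/example.shtml">
```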
If your site uses another server-side scripting language such as ASP or PHP, you can do the same trick by sending a redirect header whenever the request doesn't come in on www.mydomain.com.
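As a rough sketch of that idea (in Python rather than ASP or PHP; the hostname and the helper name are only placeholders, not part of any real framework), the server-side check could look like this:

```python
# Hypothetical sketch: decide whether a request should be redirected to
# the canonical hostname. CANONICAL_HOST and redirect_for are
# illustrative names.
from http import HTTPStatus

CANONICAL_HOST = "www.mydomain.com"

def redirect_for(request_host, path):
    """Return (status, headers) for a redirect, or None to serve the page."""
    if request_host == CANONICAL_HOST:
        return None  # canonical host: serve normally, let it be indexed
    # Any other hostname: send the client (and the bot) to the real URL.
    return (HTTPStatus.MOVED_PERMANENTLY,
            [("Location", "http://%s%s" % (CANONICAL_HOST, path))])
```

With a real 301 redirect header the meta-refresh delay isn't needed at all: the bot is told up front that the page has moved, so it should index only the target URL.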