I have a site with several domain names pointing to it. I am going to submit it to the search engines under the main URL, but I want to make sure that if crawlers find the site via links to the other domain names it is registered under, I won't get banned for "flooding" search engines with many URLs for one site.
I am going to write a robots.txt exclusion file to address this issue. I believe the right format should be as follows (the URLs listed are not the real ones, just examples):
What you need to do is split those domains out and give each one its own robots.txt. This can be done even though they all point to the same hosting account. Then simply disallow all robots on the domains you don't want indexed.
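For the domains you want kept out of the index, a blanket robots.txt that blocks every crawler from the whole site looks like this (served from the web root of each secondary domain):

```
User-agent: *
Disallow: /
```

The primary domain keeps its normal robots.txt (or none at all), so it gets crawled and indexed as usual.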
The method used to do this will depend on your server software, e.g. Apache or IIS. It's fairly easy with Apache if you can use mod_rewrite.
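As a rough sketch of the mod_rewrite approach: keep a second file (here called robots-deny.txt, a name chosen for illustration) containing the disallow-all rules, and rewrite requests for robots.txt to it on every host except the primary one. Assuming the primary domain is www.example.com and mod_rewrite is enabled, the .htaccess rules could look like:

```apache
# Serve the blocking robots file on all hosts except the primary domain.
# "example.com" and "robots-deny.txt" are placeholder names.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^(www\.)?example\.com$ [NC]
RewriteRule ^robots\.txt$ /robots-deny.txt [L]
```

With this in place, a crawler fetching http://otherdomain.com/robots.txt gets the disallow-all file, while the same request on the primary domain returns the normal robots.txt.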