The robots.txt file for a site must appear at the URL example.com/robots.txt. All directives in that file apply only to the hostname on which the file was requested.
URLs exist out on the web; paths and files exist only inside the server. The two are related only by the server configuration.
You will need four robots.txt files in your root folder: robots-mainsite.txt, robots-thissite.txt, robots-thatsite.txt and robots-othersite.txt.
These are internal filenames used only inside the server. You then rewrite requests for example.com/robots.txt, based on the requested hostname, so that the right file is fetched.
# Capture the bare site name (e.g. "thissite" from www.thissite.com) into %2
RewriteCond %{HTTP_HOST} ^(www\.)?([^.]+)\.com$ [NC]
RewriteRule ^robots\.txt$ /robots-%2.txt [L]
If you have a non-www to www canonicalisation rule, you must also add
RewriteCond %{REQUEST_URI} !^/robots
to it. Otherwise a request for example.com/robots.txt will first be rewritten internally, and the non-www/www redirect will then expose the new internal path back out on the web as a new URL (www.example.com/robots-example.txt).
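Put together, a typical non-www to www canonicalisation rule with that exclusion might look like the sketch below (example.com stands in for your real hostname):

```apache
# Redirect non-www to www, but leave robots requests alone
# so the per-hostname robots rewrite is never exposed as a redirect
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteCond %{REQUEST_URI} !^/robots
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```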
Be sure you know the difference between a redirect and a rewrite; both are coded using RewriteRule directives.
Rewrites do not "make URLs for files". Instead, the process is "exactly backwards": a rewrite examines the requested URL and then fetches the right file based on that request.
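The difference shows up in the flags. A sketch, with hypothetical filenames, contrasting the two:

```apache
# Rewrite: internal only — the browser still sees /robots.txt,
# the server quietly serves the contents of /robots-example.txt
RewriteRule ^robots\.txt$ /robots-example.txt [L]

# Redirect: the server sends the new URL back to the client;
# the browser makes a second request and its address bar changes
RewriteRule ^old-page\.html$ /new-page.html [R=301,L]
```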
The alternative method is to rewrite all requests for robots.txt to internally fetch a single robots.php file. Inside robots.php you then have a bit of logic that examines the requested hostname and sends the right reply for it.
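A minimal sketch of what that robots.php could look like, reached via an internal rewrite such as `RewriteRule ^robots\.txt$ /robots.php [L]`. The hostnames and disallow rules below are illustrative placeholders, not a recommendation:

```php
<?php
// robots.php — served internally whenever /robots.txt is requested.
// robots.txt must be sent as plain text, not HTML.
header('Content-Type: text/plain');

// Normalise the requested hostname: lowercase, strip any leading "www."
$host = strtolower($_SERVER['HTTP_HOST'] ?? '');
$host = preg_replace('/^www\./', '', $host);

// Hypothetical per-site rules — substitute your own hostnames and paths
switch ($host) {
    case 'thissite.com':
        echo "User-agent: *\nDisallow: /private/\n";
        break;
    case 'thatsite.com':
        echo "User-agent: *\nDisallow: /\n";
        break;
    default:
        // Fall back to allowing everything
        echo "User-agent: *\nDisallow:\n";
        break;
}
```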
It's your choice which method to use. The first method needs only mod_rewrite; the second needs both mod_rewrite and PHP.
[edited by: goodroi at 8:57 pm (utc) on Jan 31, 2013]