If you have code that rewrites subdomain requests to subdirectories in order to implement "multiple subdomains on one server," and you wish to serve one common robots.txt file for all of those domains and subdomains, then the answer is to exclude requests for robots.txt from being rewritten into the subdomain subdirectories.
In other words, change the subdomain rewrite code (which was not posted) from something like
RewriteCond $1 !^subdomain-directories/
RewriteCond %{HTTP_HOST} !^www\.example\.com
RewriteCond %{HTTP_HOST} ^([^.]+)\.example\.com
RewriteRule ^(.*)$ /subdomain-directories/%1/$1
to something like
RewriteCond $1 !^(robots\.txt$|subdomain-directories/)
RewriteCond %{HTTP_HOST} !^www\.example\.com
RewriteCond %{HTTP_HOST} ^([^.]+)\.example\.com
RewriteRule ^(.*)$ /subdomain-directories/%1/$1
so that any request for robots.txt is excluded from the rewrite and is served from the common file in the main document root instead.
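As an aside, the $1 back-reference in the first RewriteCond works because mod_rewrite evaluates the conditions only after the RewriteRule pattern has matched. If you'd rather not rely on that, an equivalent sketch, assuming these rules live in the document-root .htaccess file, tests %{REQUEST_URI} instead (note the leading slash in the pattern):
# Same exclusion, testing REQUEST_URI rather than the $1 back-reference
RewriteCond %{REQUEST_URI} !^/(robots\.txt$|subdomain-directories/)
RewriteCond %{HTTP_HOST} !^www\.example\.com
RewriteCond %{HTTP_HOST} ^([^.]+)\.example\.com
RewriteRule ^(.*)$ /subdomain-directories/%1/$1
Either form works; the only difference is where the test string comes from.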
Note that you could indeed use an "exclusion rule" above this subdomain-to-subdirectory rewrite if that is what your code was intended to implement. In that case, the proper syntax would have been:
RewriteRule ^robots\.txt$ - [L]
to specify "Do nothing, just quit here if robots.txt is requested."
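Putting that alternative together, a sketch (assuming the same /subdomain-directories/ layout used above) would place the exclusion rule first, like so:
# Do nothing and stop this pass if robots.txt is requested
RewriteRule ^robots\.txt$ - [L]
# Rewrite everything else on non-www hostnames into the
# matching subdomain subdirectory
RewriteCond $1 !^subdomain-directories/
RewriteCond %{HTTP_HOST} !^www\.example\.com
RewriteCond %{HTTP_HOST} ^([^.]+)\.example\.com
RewriteRule ^(.*)$ /subdomain-directories/%1/$1
Because of the [L] flag, a robots.txt request never reaches the rewrite below it.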
Jim