We want to keep Google out of the UK and AU sites, since the only difference will be the currency, and we don't want to risk being flagged for duplicate content. But my programmers can't see how to do it, since all the domains point back to the same folder.
Just to be clear: if someone comes to the .co.uk site, every page is served under a .co.uk URL, not redirected to .com.
So we want to keep bots out of all .co.uk and .com.au pages. Are we missing something?
Every time robots.txt is requested, check which hostname the request came in on. If it's the .co.uk or .com.au site, return a robots.txt that disallows everything; otherwise, return your "normal" robots.txt. Since all the domains share one folder, that means serving robots.txt dynamically rather than as a static file, e.g. by rewriting requests for /robots.txt to a script.
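Here's a minimal sketch of that idea as a plain Python/WSGI handler — the hostnames and the "normal" rules are placeholders, not your actual config, and the same check works in whatever server-side language you're already running:

    # Minimal sketch: serve a different robots.txt depending on the
    # requested hostname. Plain WSGI, no third-party packages.
    # The rule contents below are placeholders -- substitute your own.

    BLOCK_ALL = "User-agent: *\nDisallow: /\n"          # shut bots out entirely
    NORMAL    = "User-agent: *\nDisallow: /cgi-bin/\n"  # hypothetical .com rules

    def robots_app(environ, start_response):
        # Host header, lowercased, port stripped: e.g. "www.example.co.uk"
        host = environ.get("HTTP_HOST", "").lower().split(":")[0]
        if host.endswith(".co.uk") or host.endswith(".com.au"):
            body = BLOCK_ALL
        else:
            body = NORMAL
        data = body.encode("utf-8")
        start_response("200 OK", [("Content-Type", "text/plain"),
                                  ("Content-Length", str(len(data)))])
        return [data]

    if __name__ == "__main__":
        # Quick local test: any request to port 8000 returns the .com rules,
        # since localhost doesn't match either blocked suffix.
        from wsgiref.simple_server import make_server
        make_server("", 8000, robots_app).serve_forever()

The point is only that the file is generated per hostname instead of stored once in the shared folder; you could just as easily do the same one-line check in PHP or with a rewrite rule that maps /robots.txt to a script.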