
Forum Moderators: goodroi


How to exclude bots from specific domains if all point to same place?

We have foreign domains that all point to the same folder

11:15 pm on Mar 2, 2004 (gmt 0)

Junior Member

10+ Year Member

joined:Jan 29, 2003
votes: 0

We have foreign sites that all point to the same folder, and IIS then serves up the appropriate language.



We want to keep Google out of the UK and AU sites, since the only difference will be the currency, and we don't want to risk being deemed duplicate sites. But our programmers can't see how to do it, since all the domains point back to the same folder.

Just to be clear, if someone comes to the .co.uk site, all pages are served with a .co.uk URL, not redirected to .com.

So we want to keep bots out of all .co.uk and .com.au pages. Are we missing something?
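For reference, the robots.txt that would block all compliant crawlers from an entire site is just:

```
User-agent: *
Disallow: /
```

The sticking point, of course, is serving that file only on the .co.uk and .com.au hosts when every domain maps to the same folder.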

6:06 am on Mar 13, 2004 (gmt 0)

Full Member

10+ Year Member

joined:Aug 20, 2003
votes: 0

I would just cloak robots.txt in this case.

Every time robots.txt is requested, you'd check which host the request came in on. If it's the UK or AU site, return a robots.txt that disallows access to the entire site; otherwise, return your "normal" robots.txt.
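A minimal sketch of that host check, in Python (the hostnames and function name are illustrative; on IIS you'd put the same logic in whatever handler serves robots.txt):

```python
# Serve a different robots.txt body depending on the requested host.
# BLOCKED_HOSTS lists the duplicate-content hosts we want crawlers to skip
# (example.* names are placeholders for your actual domains).

BLOCK_ALL = "User-agent: *\nDisallow: /\n"   # keep bots out entirely
ALLOW_ALL = "User-agent: *\nDisallow:\n"     # the "normal" robots.txt

BLOCKED_HOSTS = {
    "example.co.uk", "www.example.co.uk",
    "example.com.au", "www.example.com.au",
}

def robots_txt_for(host):
    """Return the robots.txt body appropriate for the requested Host header."""
    host = host.lower().split(":")[0]  # normalize case, strip any port
    return BLOCK_ALL if host in BLOCKED_HOSTS else ALLOW_ALL
```

Wire `robots_txt_for()` to the incoming Host header and every domain can keep pointing at the same folder while the UK and AU hosts hand crawlers a disallow-all file.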