
Sitemaps, Meta Data, and robots.txt Forum

How to exclude bots from specific domains if all point to same place?

 11:15 pm on Mar 2, 2004 (gmt 0)

We have foreign sites that all point to the same folder, and IIS then serves up the appropriate language.

We want to keep Google out of the .co.uk and .com.au sites, since the only difference will be the currency, and we don't want to risk being deemed duplicate sites. But my programmers can't see how to do it, since all the domains point back to the same folder.

Just to be clear: if someone comes to the .co.uk site, all pages are served with a .co.uk URL, not redirected to .com.

So we want to keep bots out of all .co.uk and .com.au pages. Are we missing something?



 6:06 am on Mar 13, 2004 (gmt 0)

I would just cloak robots.txt in this case.

Every time robots.txt is requested, check which site was requested. If it's either the .co.uk or the .com.au site, return a robots.txt that disallows access to the whole site. Otherwise, return your "normal" robots.txt.
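The host-based selection described above can be sketched like this (a minimal illustration, not code from the thread; the domain names are placeholders, and while the original site ran IIS, the same logic works in any server-side language that can read the request's Host header):

```python
# Sketch: serve a different robots.txt body depending on the Host header.
BLOCK_ALL = "User-agent: *\nDisallow: /\n"   # keep all bots out
ALLOW_ALL = "User-agent: *\nDisallow:\n"     # allow everything

# Hypothetical hostnames for the mirrors we want crawlers to skip.
BLOCKED_HOSTS = {
    "example.co.uk", "www.example.co.uk",
    "example.com.au", "www.example.com.au",
}

def robots_txt_for(host: str) -> str:
    """Return the robots.txt body to serve for a given Host header."""
    host = host.lower().split(":")[0]  # normalise case, strip any port
    return BLOCK_ALL if host in BLOCKED_HOSTS else ALLOW_ALL
```

Hooking this up means registering a handler for the `/robots.txt` path that returns `robots_txt_for(request_host)` as `text/plain`; the pages themselves stay identical across domains, only the robots.txt differs per host.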

WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved