Forum Moderators: open


Google and SubDomains

How do I disallow almost duplicate subs?


HayMeadows

9:24 pm on Oct 23, 2003 (gmt 0)

10+ Year Member



How do I keep Googlebot out of almost duplicate sub-domains?

Example:

www.widgets.com
red.widgets.com

Where www.widgets.com is our main site, and the only difference on red.widgets.com is that it carries different graphics for "corporate" sponsorships.

TIA

ciml

1:20 pm on Oct 24, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think I'd be tempted to use the Robots Exclusion Protocol to deny crawling of the subdomain. In red.widgets.com's /robots.txt:

User-agent: *
Disallow: /

killroy

3:41 pm on Oct 24, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You could put that in a file called red.robots.txt, then set a rewrite condition on the host, and use mod_rewrite to map robots.txt to red.robots.txt when the condition holds.

This is assuming the pages don't run on two sets of files.

SN
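
A minimal sketch of the mod_rewrite approach killroy describes, assuming both hostnames share one document root, mod_rewrite is enabled, and .htaccess overrides are allowed; the file name red.robots.txt and the host red.widgets.com follow the examples above:

```apache
# .htaccess in the shared document root
RewriteEngine On

# When the request arrives on the sponsor subdomain,
# serve red.robots.txt in place of robots.txt.
RewriteCond %{HTTP_HOST} ^red\.widgets\.com$ [NC]
RewriteRule ^robots\.txt$ /red.robots.txt [L]
```

red.robots.txt would then hold the "Disallow: /" rules from ciml's post, while requests for robots.txt on www.widgets.com still get the normal, crawlable file.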