Forum Moderators: goodroi

How to avoid double indexing



10:10 am on Oct 30, 2007 (gmt 0)

5+ Year Member

Hi everybody,

I have two domains, and my website can be accessed through both of them, in both www and non-www versions.

Now I only want www.domain2.com to be indexed by the search engines, because it has the best PR.

So if I create a robots.txt file with the following content, will that work?

User-agent: WebCrawler
Disallow: [domain1.com...]
Disallow: [domain1.com...]
Disallow: [domain2.com...]

I tried redirects and mod_rewrite before, but I don't really care whether people can access the site through different addresses... (I only care about search-engine indexing.)

Hope somebody can help me with this.

Thanks to everybody


12:46 pm on Oct 30, 2007 (gmt 0)

WebmasterWorld Senior Member jdmorgan is a WebmasterWorld Top Contributor of All Time 10+ Year Member

The format you show above is not supported by robots.txt. Records in robots.txt specify only local URL-paths; including the protocol and domain is invalid.
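To illustrate the point, a valid robots.txt record looks like this (the path `/private/` is just a hypothetical example) -- note that there is no place to name a domain, because each host serves its own robots.txt:

```
# Valid robots.txt syntax: paths only, no protocol or domain
User-agent: *
Disallow: /private/
```

A rule like `Disallow: http://domain1.com/` is simply ignored by compliant robots, which is why robots.txt cannot solve a multi-domain problem by itself.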

The correct solution to this problem is to use a 301-Moved Permanently redirect to redirect all non-canonical domain variations to the single canonical domain. This is a popular subject, and searches on WebmasterWorld for "domain canonicalization," "canonical domain," and "www non-www domain" will turn up hundreds of threads with discussion and code examples.
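As a sketch of that approach, assuming Apache with mod_rewrite enabled and www.domain2.com as the canonical host (as stated above), an .htaccess file along these lines would 301-redirect every other hostname variant to the canonical one:

```
# .htaccess sketch -- redirect all non-canonical hosts
# (domain1.com, www.domain1.com, domain2.com) to the canonical domain
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.domain2\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain2.com/$1 [R=301,L]
```

Because the 301 tells the search engines the move is permanent, the PR and link-popularity of the old variants is consolidated onto the canonical domain over time.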

Leaving your site as it is now will result in PageRank and link-popularity being 'split' across the multiple domain variants -- in effect, making your site compete against itself in the search results. Attempts to promote more than one domain may even result in search engine duplicate-content penalties -- if the promotion is too heavy or the domains too numerous.



4:43 pm on Nov 13, 2007 (gmt 0)

5+ Year Member

Or you can edit your meta tags to keep those pages out of the index, although that is the long way around.
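For completeness, the meta-tag approach means serving a robots meta element in the head of every page on the domains you do not want indexed (which only works here if each domain can serve different markup):

```
<!-- In the <head> of pages on the non-canonical domains -->
<meta name="robots" content="noindex,follow">
```

Unlike the 301 redirect, this drops the pages from the index but does not consolidate their link-popularity onto the canonical domain, which is why it is the long way around.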
