
Duplicate websites, with a robots.txt file so that one of them isn't indexed?

Can the website still get penalized for it?

11:51 am on Sep 27, 2006 (gmt 0)

New User

10+ Year Member

joined:May 8, 2004
posts:12
votes: 0


I've had this question for a long time. Suppose I have two websites, widget1.com and widget2.com, with similar content (read: duplicate websites). If I write a robots.txt file to disallow spiders from crawling widget2.com, am I still violating Google's guidelines?
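
The robots.txt I have in mind at the root of widget2.com would be something like this (just a sketch; the domain names are placeholders):

User-agent: *
Disallow: /

i.e. asking all compliant crawlers to stay away from the whole second site.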
4:44 pm on Sept 27, 2006 (gmt 0)

Junior Member

10+ Year Member

joined:Feb 9, 2006
posts:103
votes: 0


I'm thinking of something like this: I want to move my site to a new URL but don't want any duplicate-content penalty. What is the correct way to do it?
5:04 pm on Sept 27, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member jomaxx is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Nov 6, 2002
posts:4768
votes: 0


In response to the OP: no, of course you're not violating their guidelines. They'll never spider the second site, and they'll have no knowledge of, or interest in, what's on it.

That's not to say it's a good idea, though. It seems like unnecessary effort, and any inbound links the second site acquires will be wasted.

5:27 pm on Sept 27, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member quadrille is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Feb 22, 2002
posts:3455
votes: 0


Agreed. Much better to make one good site, and 301 the other domain. Why divide your efforts?
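
For example (a minimal sketch, assuming the second domain is hosted on Apache with mod_rewrite available), a sitewide 301 from widget2.com to widget1.com could go in the .htaccess at the root of widget2.com:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?widget2\.com$ [NC]
RewriteRule ^(.*)$ http://www.widget1.com/$1 [R=301,L]

You can then check any page on the old domain with something like:

curl -I http://widget2.com/

and confirm it returns a 301 with a Location header pointing at widget1.com.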
 
