

Duplicate websites: using robots.txt to keep one of them out of the index?

Can the website still get penalized for it?

Copper

11:51 am on Sep 27, 2006 (gmt 0)

10+ Year Member



I've had this question for a long time. Suppose I have two websites, widget1.com and widget2.com, with similar content (i.e., duplicate sites). If I write a robots.txt file to disallow spiders from crawling widget2.com, am I still violating Google's guidelines?
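For reference, a minimal robots.txt that blocks all compliant crawlers from the whole site would look like this (it would be served at the root of widget2.com; the file name and location are fixed by the robots exclusion convention):

```
# http://widget2.com/robots.txt
# The wildcard user-agent applies to every compliant crawler,
# and "Disallow: /" blocks crawling of the entire site.
User-agent: *
Disallow: /
```

Note that this only stops compliant crawlers from fetching pages; it does not by itself guarantee the URLs never appear in an index.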

b2net

4:44 pm on Sep 27, 2006 (gmt 0)

5+ Year Member



I'm thinking of something like this: I want to move my site to a new URL but don't want any duplicate-content penalty. What is the correct way to do it?

jomaxx

5:04 pm on Sep 27, 2006 (gmt 0)

WebmasterWorld Senior Member jomaxx is a WebmasterWorld Top Contributor of All Time 10+ Year Member



In response to the OP: no, of course you're not violating their guidelines. They'll never spider the second site and will have no knowledge of, or interest in, what's on it.

That's not to say it's a good idea, though. It seems like unnecessary effort, and any inbound links the second site acquires will be wasted.

Quadrille

5:27 pm on Sep 27, 2006 (gmt 0)

WebmasterWorld Senior Member quadrille is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Agreed. Much better to make one good site, and 301 the other domain. Why divide your efforts?
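On an Apache server, the 301 approach described above might be sketched with mod_rewrite like this (assuming widget2.com should redirect permanently to widget1.com; the domains are the example ones from this thread, and mod_rewrite must be enabled):

```
# .htaccess on widget2.com
# Permanently (301) redirect every request, preserving the path,
# so inbound links to widget2.com pass to widget1.com.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?widget2\.com$ [NC]
RewriteRule ^(.*)$ http://widget1.com/$1 [R=301,L]
```

The R=301 flag is what signals a permanent move to search engines, as opposed to the default temporary (302) redirect.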
 
