Duplicate websites with a robots.txt file to keep one of them out of the index?
Can the website still get penalized for it?
11:51 am on Sep 27, 2006 (gmt 0)
I've had this question for a long time. Suppose I have two websites, widget1.com and widget2.com, with similar content (read: duplicate websites), and I write a robots.txt file to disallow spiders from crawling widget2.com. Am I still violating Google's guidelines?
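For what it's worth, blocking all spiders from an entire site takes only two lines. A minimal sketch of what that robots.txt on widget2.com might look like (the domain names are just the poster's examples):

```
# Served at http://widget2.com/robots.txt
# Tells every compliant crawler not to fetch any page on this host
User-agent: *
Disallow: /
```

Note this only blocks crawling; it does not redirect visitors or pass any link value to widget1.com.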
4:44 pm on Sep 27, 2006 (gmt 0)
I'm thinking of something related. I want to move my site to a new URL but don't want any dup penalty. What is the correct way to do it?
5:04 pm on Sep 27, 2006 (gmt 0)
In response to the OP: no, of course you're not violating their guidelines. They'll never spider the second site, so they have no knowledge of or interest in what's on it.
That's not to say it's a good idea, though. It seems like unnecessary effort, and any inbound links the second site acquires will be wasted.
5:27 pm on Sep 27, 2006 (gmt 0)
Agreed. Much better to build one good site and 301 the other domain to it. Why divide your efforts?
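For anyone who wants to try the 301 approach: assuming the second domain runs on Apache with mod_rewrite available (a common setup, though your host may differ), a sketch of an .htaccess for widget2.com might look like this:

```
# .htaccess on widget2.com (assumes Apache + mod_rewrite)
# Permanently redirects every request to the same path on widget1.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?widget2\.com$ [NC]
RewriteRule ^(.*)$ http://widget1.com/$1 [R=301,L]
```

Unlike the robots.txt approach, a 301 also forwards visitors and lets any inbound links to the old domain count toward the one site you keep.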