
Google SEO News and Discussion Forum

    
Duplicate websites with a robots.txt file to not index one of them?
Can the website still get penalized for it?
Copper
msg:3098870 - 11:51 am on Sep 27, 2006 (gmt 0)

I've had this question for a long time. Suppose I have two websites, widget1.com and widget2.com, with similar content (i.e., duplicate sites). If I write a robots.txt file to disallow spiders from crawling widget2.com, am I still violating Google's guidelines?
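For clarity, the robots.txt being described here is presumably a blanket disallow served at the root of widget2.com. A minimal sketch, assuming the intent is to block all compliant crawlers (Googlebot included) from every page of the duplicate site:

    User-agent: *
    Disallow: /

With that in place, Google won't fetch any pages from widget2.com, though blocked URLs can still appear in results as bare, uncrawled listings if other sites link to them.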

 

b2net
msg:3099255 - 4:44 pm on Sep 27, 2006 (gmt 0)

I'm thinking of something along these lines. I want to move my site to a new URL but don't want any duplicate-content penalty. What is the correct way to do it?

jomaxx
msg:3099273 - 5:04 pm on Sep 27, 2006 (gmt 0)

In response to the OP: no, of course you're not violating their guidelines. They'll never spider the second site and will have no knowledge of, or interest in, what's on it.

That's not to say it's a good idea, though. It seems like unnecessary effort, and any inbound links the second site acquires will be wasted.

Quadrille
msg:3099306 - 5:27 pm on Sep 27, 2006 (gmt 0)

Agreed. Much better to build one good site and 301-redirect the other domain to it. Why divide your efforts?
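For b2net and anyone else setting this up on Apache, a minimal .htaccess sketch for the old domain might look like the following (assuming widget1.com is the site being kept, widget2.com the one being retired, and mod_rewrite is available; swap in your own hostnames):

    # Permanently redirect every request on widget2.com to the same path on widget1.com
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?widget2\.com$ [NC]
    RewriteRule ^(.*)$ http://widget1.com/$1 [R=301,L]

The R=301 flag marks the redirect as permanent, so search engines treat it as a move and consolidate the two domains rather than seeing duplicate sites.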
