Forum Moderators: Robert Charlton & goodroi


How to deal with duplicate content sites

duplicate content, seo, keyword ranking


johnlim9988

6:50 am on May 24, 2007 (gmt 0)

10+ Year Member



Hi,

About two years ago, we were not very familiar with the essentials of SEO, and built many sites just according to different categories.

Now we understand that was wrong and there is a lot of duplicate content. Now we concentrate on only two sites (actually mainly just one) and write a lot of unique content for them.

Now the problem is how to deal with so many former duplicate sites... About 4 months ago, we added robots.txt to those duplicate sites and set noindex, nofollow for the robots.
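(For reference, the setup described above usually looks something like this — example.com and the page paths here are placeholders, not the actual sites:)

```
# robots.txt at the root of each duplicate domain -- blocks all crawling
User-agent: *
Disallow: /
```

The noindex, nofollow part is a separate tag in each page's head section: <meta name="robots" content="noindex, nofollow">. One caveat: if robots.txt blocks crawling entirely, the bots may never fetch a page to see that meta tag, so pages already in the index can linger.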

My question is: is the robots.txt enough? Will those duplicate sites continue to affect our main site?

Thanks.

tedster

6:54 am on May 24, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yes, that's all you need to do to remove a domain from the Google index. The duplication from those sites should not haunt you.

pageoneresults

7:06 am on May 24, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Now the problem is how to deal with so many former duplicate sites... About 4 months ago, we added robots.txt to those duplicate sites and set noindex, nofollow for the robots.

Have those sites been removed from the index? Or, can you still find them using site: searches? If you can still find them in Google, you may want to go one step further and follow the instructions here...

How do I use the URL removal request tool?
[google.com...]

Just be sure that you want to permanently remove those references. :)

johnlim9988

8:04 am on May 24, 2007 (gmt 0)

10+ Year Member



Now if I check on Google using site:myduplicatedomain.com, no pages are cached any more.

But my main site's keyword rankings still have not recovered.

I am confused. When will my main site's keyword rankings come back?

johnlim9988

1:54 am on Jun 3, 2007 (gmt 0)

10+ Year Member



Hi,

After we added robots.txt and the nofollow, noindex instructions for the robots to the duplicated sites, search engines like Google no longer have those sites indexed (checking with site:domain.com).

But now the problem is that our main site, with much unique content, still does not rank well on Google. We are confused and worried.

What should we do to get the main site's rankings back?

Thanks.

johnlim9988

8:34 am on Jun 13, 2007 (gmt 0)

10+ Year Member



Does anybody have any new ideas about this? Why have our main sites still not recovered? Thanks.

tedster

4:51 pm on Jun 13, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'd say study the threads in our Hot Topics [webmasterworld.com] section for ideas -- particularly all the steps for a successful site on Google that Brett outlined in this thread:

[webmasterworld.com...]

Silvery

7:10 pm on Jun 15, 2007 (gmt 0)

10+ Year Member



I have an additional recommendation on this:

If those old sites had been around for a long while, they may have developed their own sets of inbound links from other sites, and their own PageRank.

Merely changing their robots.txt to halt their inclusion in Google's index would not have transferred their goodwill (PageRank) back to the desired core sites. Yes, that removed the duplication issue, but it threw the PageRank baby out with the bathwater.

Most desirable would have been to use 301 redirects from each of those old sites' pages to their equivalent pages on the main domains. That would have effectively transferred all their PageRank to the main sites and removed the duplication issue.
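(On an Apache server, a page-to-page mapping like that can be sketched in the old domain's .htaccess — olddomain.com, maindomain.com, and the page names are placeholders:)

```apache
# .htaccess on olddomain.com -- permanently (301) redirect each old page
# to its equivalent page on the main site (mod_alias Redirect directive)
Redirect permanent /widgets.html http://www.maindomain.com/widgets.html
Redirect permanent /about.html   http://www.maindomain.com/about.html
```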

Programmatically, I know it can be difficult to make sure each of the old pages is 301ed to the right page on the other sites. If that's the case, it could be sufficient to redirect all the deep-linked pages to the homepages of the newly authoritative main sites.
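(That fallback — everything on the old domain going to the main homepage — can be sketched as a single catch-all rule, again with placeholder domains:)

```apache
# .htaccess on olddomain.com -- 301 every request to the main site's homepage
RewriteEngine On
RewriteRule ^ http://www.maindomain.com/ [R=301,L]
```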

good luck!

Silvery

7:14 pm on Jun 15, 2007 (gmt 0)

10+ Year Member



Also, it still may not be too late to do what I described above.

You might first check whether those old sites still have links from other external sites by doing a link:www.example.com search on Google and Yahoo.

If they do indeed have inbound links, I'd suggest you go back, remove the robots.txt instructions that exclude the bots, and set up 301 redirects to your main sites. The old domains can be maintained cheaply with just the redirection running on them, and it's worth it for the old goodwill you may have built up on them.

johnlim9988

12:39 am on Jun 17, 2007 (gmt 0)

10+ Year Member



Silvery,

Thanks, and you made a good point on the problem.

But I have two other questions:

1) Suppose duplicate site A.com has a link from linkstoAsite.com pointing to it.

At the same time, linkstoAsite.com also links to my authoritysite.com.

If I use a 301 redirect from A.com to authoritysite.com, will this do harm to authoritysite.com?

2) A.com has had robots.txt (noindex, nofollow) for several months, and no backlinks are listed for it any more.

If I take out the robots.txt (noindex, nofollow), will the backlinks come back?

And when is the best time to put the 301 redirects on such duplicated sites (sites with no backlinks listed any more)? Put the 301s in now, or wait for the backlinks to show up again, if they do?

roodle

9:23 pm on Jun 18, 2007 (gmt 0)

10+ Year Member



I recently did something like this for a duplicate site with two domain names. I simply made sure all the pages on version B of the site had 301 redirects to their counterparts on version A. Then I checked Google for inbound links to version B and, where possible, contacted the linking sites to request they change the domain name to A.

After about 5 months, Google now has only one page left indexed from version B (plus a couple of supplemental results for pages that no longer exist). The site is just coming out of the woods, so to speak, and starting to rank much better.