Some years ago, we were not very familiar with SEO and built many sites, basically one for each category.
Now we understand that was a mistake, and those sites contain a lot of duplicate content. Today we concentrate on just two sites (really mainly just one) and write a lot of unique content for them.
The problem now is how to deal with all of those old duplicate sites. About 4 months ago we added a robots.txt to each duplicate site and set noindex, nofollow for the robots.
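For reference, what we added looks roughly like this (example.com is just a placeholder for the real duplicate domains). In robots.txt at the root of each duplicate site:

User-agent: *
Disallow: /

and in the <head> of every page, the robots meta tag:

<meta name="robots" content="noindex, nofollow">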
My question is: is it enough to just put up the robots.txt? Will those old sites continue to affect our main site?
Thanks.
The problem now is how to deal with all of those old duplicate sites. About 4 months ago we added a robots.txt to each duplicate site and set noindex, nofollow for the robots.
Have those sites been removed from the index? Or, can you still find them using site: searches? If you can still find them in Google, you may want to go one step further and follow the instructions here...
How do I use the URL removal request tool?
[google.com...]
Just be sure that you want to permanently remove those references. :)
After we added the robots.txt and set noindex, nofollow for the robots on the duplicate sites, search engines like Google stopped indexing them (checked with site:domain.com).
But the problem now is that our main site, with lots of unique content, still doesn't get good rankings on Google. We are confused and worried.
What should we do to get the main site's ranking back?
Thanks.
[webmasterworld.com...]
If those old sites had been around for a long while, they may've developed their own sets of in-bound links from other sites and PageRank.
Merely changing their robots.txt to halt inclusion of them in Google's index would not have transferred their good-will (PageRank) back to the desired core sites. Yes, that would've removed the duplication issue, but it would've thrown the PageRank baby out with the bathwater.
Most desirable would've been to use 301 redirects from each of those old sites' pages to their equivalent pages at the main domains. That would've effectively transferred all their PageRank to the main sites and removed the duplication issue.
Programmatically, I know it can be difficult to make sure each of the old pages is 301ed to the right page at the new sites. If that's the case, it could be sufficient to make all the deep-linked pages redirect to the homepages of the newly-authoritative main sites. A rough sketch follows below.
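If those old sites happen to run on Apache, for example, a per-domain .htaccess along these lines sketches both approaches (www.mainsite-example.com is only a placeholder; use one option, not both):

# Option 1 (mod_alias): path-for-path, when the old URLs map cleanly onto the main site
Redirect 301 / http://www.mainsite-example.com/

# Option 2 (mod_rewrite): send every old URL to the main site's homepage
RewriteEngine On
RewriteRule ^ http://www.mainsite-example.com/ [R=301,L]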
good luck!
You might first check to see if those old sites still have links to them from other external sites by doing a link:www.example.com link-check on Google and Yahoo.
If they do indeed have inbound links, I'd suggest you go back and remove those robots.txt instructions that exclude the bots, and set up 301 redirects to your main sites. The old domains can be maintained cheaply with nothing but the redirects running on them, and it's worth it for the old good-will you may've developed on them.
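Once the robots.txt blocks are gone and the redirects are live, a quick sanity check from the command line (assuming you have curl handy; the domain below is just a placeholder) will confirm the 301 is actually being served:

curl -I http://www.old-duplicate-example.com/some-old-page.html

Look for an "HTTP/1.1 301 Moved Permanently" status and a Location: header pointing at the corresponding page on the main site.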
Thanks, you made a good point about the problem.
But I have two other questions:
1) Duplicate site A.com has a site, linkstoAsite.com, pointing to it.
At the same time, linkstoAsite.com also links to my authoritysite.com.
If I use a 301 redirect from A.com to authoritysite.com, will this do any harm to authoritysite.com?
2) A.com has had the robots.txt (noindex, nofollow) in place for several months, and no backward links are listed for it any more.
If I take out the robots.txt (noindex, nofollow), will the backward links come back again?
And when is the best time to put the 301 redirects on such duplicate sites (the ones with no backward links listed any more)? Put the 301s in place now, or wait for the backward links to show up again, if they ever do?
After about 5 months, Google now has only 1 page left indexed from version B (plus a couple of supplemental results for pages that no longer exist). The site is just coming out of the woods, so to speak, and starting to rank much better.