

advice needed on 'duplicate' content / themes



10:30 am on Jan 9, 2010 (gmt 0)

10+ Year Member

I would be grateful for some advice on how to implement a change to the content of my sites.
Here's the scenario:

I have 2 websites that are both in my name and on the same server. One of them supplies more general information; the other one is a commercial site.
Both are related to the same niche (area) and have been doing very well in the SERPs for a few years now. There are only 2 links going from the general site to the commercial site.

I would like to add a smallish product section to the general-info site and have already set things up to avoid real duplicate content - different layout, different descriptions, different titles etc. I have now realised that, because the two sites share a database, all the product images have URLs on the commercial site, meaning loads of links going to the commercial domain.

I am therefore leaning towards blocking the directory containing the new product section with robots.txt & adding noindex tags to the pages. I don't really need to rank for these pages.
Would this be enough or would there still be a risk of some penalty?
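
For illustration, assuming the new product section sits in a directory called /products/ on the general-info site (the actual path is just a placeholder), what I have in mind is a robots.txt rule like:

User-agent: *
Disallow: /products/

plus a meta robots tag on each product page as a backup:

<meta name="robots" content="noindex">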

Also, I am not sure what to do with the 'landing page'. There would be a link to it in the menu, so on every page of the site (and the link text would have to be related to the products). Should I add rel=nofollow to that link? Or should I let Google index that page and just disallow all the other pages?

Could it be that just the fact that the second site would now also have a relation to this specific commercial sector triggers a penalty?

[edited by: tedster at 6:07 pm (utc) on Jan. 9, 2010]
[edit reason] member suggestion [/edit]


7:26 pm on Jan 9, 2010 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

Just the robots.txt alone is a good plan, and it will avoid any duplicate content conflict with the other website's rankings. The meta tag is OK to do too, mostly as a backup for safety's sake. But googlebot needs to spider a page in order to see the meta tag in the first place, and with the robots.txt disallow rule in place, it will not do that.

The rel=nofollow attribute is fine to add to the links - it would help keep the URLs from accumulating link juice and then potentially showing up in the SERPs as a "url-only" listing.
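
For example - assuming the landing page lives at /products/ on the general-info site (substitute the real URL and anchor text) - the nofollowed menu link would simply be:

<a href="/products/" rel="nofollow">Products</a>

That keeps the menu link from passing link juice to the blocked section while still letting visitors reach the page.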

However, true duplicate content penalties are very rare. What happens (except in the case of serious spamming and scraping) is that all but one version of the content is filtered out of any particular SERP, but there's no penalty against the domain. See Duplicate Content Demystified [webmasterworld.com].

That thread is easily found through our Hot Topics area [webmasterworld.com], which is always pinned to the top of this forum's index page.


8:17 pm on Jan 9, 2010 (gmt 0)

10+ Year Member

Thanks for your reply and for pointing out the threads! It sounds like I could probably get away with not blocking the 'new' content, but I think I would rather be over-cautious and go ahead with it.
