Google SEO News and Discussion Forum

    
advice needed on 'duplicate' content / themes
joergnw10 - msg:4057264 - 10:30 am on Jan 9, 2010 (gmt 0)

I would be grateful for some advice on how to implement a change to the content of my sites.
Here is the scenario:

I have two websites that are in my name and on the same server. One of them supplies more general information; the other is a commercial site.
Both are related to the same niche (area) and have been doing very well in the SERPs for a few years now. There are only two links going from the general site to the commercial one.

I would like to add a smallish product section to the general-information site and have already set things up to avoid real duplicate content - different layout, different descriptions, titles etc. I have now realised that, because the two sites share a database, all the product images are served from URLs on the commercial site, meaning loads of links going to the commercial domain.

I am therefore leaning towards blocking the directory containing the new product section with robots.txt and adding noindex meta tags to the pages. I don't really need these pages to rank.
Would this be enough, or would there still be a risk of some penalty?
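
To be concrete, I was thinking of something along these lines - the /products/ directory name is just for illustration, not the real path:

    # robots.txt on the general-information site
    User-agent: *
    Disallow: /products/

    <!-- and on each page inside that section -->
    <meta name="robots" content="noindex">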

Also, I am not sure what to do with the 'landing page'. There would be a link to it in the menu, so on every page of the site (and the anchor text would have to be related to the products). Should I add rel=nofollow to that link? Or should I let Google index that page and just disallow all the other pages?
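
In other words, the menu link would be something like this (URL and anchor text are only placeholders):

    <a href="/products/" rel="nofollow">Product guide</a>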

Could the mere fact that the second site now also has a relation to this specific commercial sector trigger a penalty?

[edited by: tedster at 6:07 pm (utc) on Jan. 9, 2010]
[edit reason] member suggestion [/edit]

 

tedster - msg:4057435 - 7:26 pm on Jan 9, 2010 (gmt 0)

The robots.txt disallow alone is a good plan and will avoid any duplicate content conflict with the other website's rankings. The meta tag is OK to add too, mostly as a backup for safety's sake. But googlebot needs to spider a page in order to see the meta tag in the first place, and with the robots.txt disallow rule in place, it will not do that.
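
To illustrate with the same placeholder path as above - once robots.txt says

    User-agent: *
    Disallow: /products/

googlebot never requests anything under /products/, so a <meta name="robots" content="noindex"> on those pages is never actually read. The disallow rule is doing the work; the meta tag only comes into play if you later remove the disallow.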

The rel=nofollow attribute is fine to add to the links - it would help keep the URLs from accumulating link juice and then potentially showing up in the SERPs as a "URL-only" listing.

However, true duplicate content penalties are very rare. What happens (except in the case of serious spamming and scraping) is that all but one version of the content is filtered out of any particular SERP, but there's no penalty against the domain. See Duplicate Content Demystified [webmasterworld.com].

That thread is easily found through our Hot Topics area [webmasterworld.com], which is always pinned to the top of this forum's index page.

joergnw10 - msg:4057471 - 8:17 pm on Jan 9, 2010 (gmt 0)

Thanks for your reply and for pointing out the threads! It sounds like I could probably get away with not blocking the 'new' content, but I would rather be over-cautious and go ahead with it.
