Forum Moderators: Robert Charlton & goodroi


Duplicate Content - Does it hurt the site, or just the page?

         

peterdaly

6:52 pm on May 19, 2008 (gmt 0)

10+ Year Member



There's a large section of a site that I am working on that has duplicate content from other sites due to syndication. The site is a content "consumer", not the producer. We're not talking press release size syndication...the content is used on maybe 1-5 other sites.

From performing searches on Google for unique sentences (in quotes) within the content, it's obvious that many of the pages are triggering either a duplicate content penalty or a filter.

Do the penalized pages only hurt their own traffic, or does the duplicate content penalty or filter hurt the rest of the (unique) pages on the site as well?

Although many of the pages in this section have duplicate content in their body (not the navigation etc), on the whole, this section is one of the main traffic drivers on the site.

What's the best course of action? Should I triage and robots.txt all the "non-performing" pages? Should I leave it as it is? Do something else entirely?

This is a "large brand" site.

tedster

7:52 pm on May 19, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



By far the most common experience is page-specific filtering, removing all but what Google judges to be the "best" one. But sometimes, if a domain is seen as being over-filled with duplicate content, there can also be a domain-wide penalty.

Google's intent is that their users should not see multiple copies of the same content in their search results. What you need to do is find a way to add real value to your pages for the end user, so that they are not a mere duplicate of what is available on other websites.

peterdaly

8:36 pm on May 19, 2008 (gmt 0)

10+ Year Member



tedster,

I know and agree.

1. This isn't my own site.
2. This content will not be going away.

Trying to take lemons and make lemonade, given the parameters I need to work within.

kidder

12:17 am on May 20, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If the site is strong enough, it can and will outrank other sites with the same or similar content, at least from what I have seen on my own sites. It's a question of muscle - if you have it, you can use it, so to speak. But if you have excessive amounts of duplicate content, then you're looking for trouble.

Huntster

3:25 am on May 22, 2008 (gmt 0)

10+ Year Member



If the site is strong enough, it can and will outrank other sites with the same or similar content, at least from what I have seen on my own sites. It's a question of muscle - if you have it, you can use it, so to speak. But if you have excessive amounts of duplicate content, then you're looking for trouble.

I agree. I have some sections on a pretty large site that are all unique - meaning we wrote them - but there are dupes in some groups, and no harm has been done.

[edited by: Huntster at 3:25 am (utc) on May 22, 2008]

Robert Charlton

6:24 am on May 22, 2008 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



What's the best course of action? Should I triage and robots.txt all the "non-performing" pages? Should I leave it as it is? Do something else entirely?

The problem with the robots.txt approach is that if someone linked to the non-performing pages, the links wouldn't be followed or credited. Conceivably, if this dupe content is buried in archives on the other sites, it might not take much in the way of inbounds to the pages on your site to push your versions to ranking status.
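
One commonly used alternative to a robots.txt block - not suggested anywhere in this thread, so treat it as an assumption about what might fit here - is a page-level meta robots tag placed in the head of each duplicated page:

<meta name="robots" content="noindex, follow">

That asks Google to keep the page out of the index while still crawling it, so the links on the page can still be followed, which avoids part of the drawback described above.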

I don't know whether it's in the cards for you to add introductory content and postscripts to the duplicated material. That also might be enough to snap the pages out of dupe status, particularly if your commentary is in itself worth linking to. Intro pages that attract links and spruce up the section might also be a way of boosting it as a whole.