
Content, Writing and Copyright Forum

    
Duplicate content issue - What's the truth?
Dectomax
msg:919486
11:11 pm on Mar 15, 2006 (gmt 0)


Hello there,
This is my first post here so please be gentle with me.

I've been reading post after post after post trying to clear up the whole well-trodden website content/article submission issue. I cannot seem to get a definitive answer. Basically my confusion is this.

Let's say I write a new, informative article for my website which I am very proud of, and then leave it for a few months until it is indexed by the major search engines.

I then submit this identical article to an article directory with the aim of getting a few new incoming links, a bit of extra traffic, and an opportunity to show off my writing skills/expertise, etc. I understand at this point that I may not improve my page rank or positioning, but surely the search engines won't punish me by treating the original indexed article as duplicate content and filtering it as such, just because it has appeared somewhere else two months later?

I would have thought that the indexed article is treated as the original and all subsequent appearances of the article are classed as duplicates.

If this is not the case, surely you could destroy your competition by duplicating all of their website content on several different websites and ruin their rankings.

I would love to know that I could submit my existing work to other pages without having to rewrite what may already be a very good article.

Please help clear up my confusion.

 

Stefan
msg:919487
11:21 pm on Mar 15, 2006 (gmt 0)

Welcome aboard, Dectomax.

surely the search engines won't punish me by treating the original indexed article as duplicate content and filtering it as such, just because it has appeared somewhere else two months later?

The duplicate problem will only kick in if the pages are substantially the same, including layout/code/the-whole-shebang. If the only thing the same is the text for the article, you probably don't have anything to worry about. When you read posts here about canonical problems/hijacking/etc, and dupe content, it's because the pages are seen as exactly the same.

That said, you run the risk of having your own article get pushed down in the serps if the new site that posts it has better Page Rank, etc. You'll still be there, but lower than them. You have to have a good look at the site that you're contributing the article to, and play it by ear.
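
To make the "text vs. whole page" distinction concrete: a text-only duplicate check can be pictured as stripping the markup and comparing overlapping word "shingles" between two pages. The Python below is only a rough sketch of that general technique, not a description of how any search engine actually works:

import re

def visible_text(html):
    # Crude markup stripper: drop the tags, collapse whitespace, lowercase.
    text = re.sub(r"<[^>]+>", " ", html)
    return re.sub(r"\s+", " ", text).strip().lower()

def shingles(text, size=5):
    # Overlapping runs of `size` consecutive words.
    words = text.split()
    return {" ".join(words[i:i + size]) for i in range(max(len(words) - size + 1, 0))}

def text_similarity(html_a, html_b):
    # Jaccard overlap of the two pages' shingle sets: 0.0 = unrelated, 1.0 = identical text.
    a, b = shingles(visible_text(html_a)), shingles(visible_text(html_b))
    return len(a & b) / len(a | b) if a and b else 0.0

# The same article body dropped into two very different page templates:
article = "how to oil a squeaky widget hinge in five easy steps " * 20
page_a = "<html><body><div id='nav'>Home | About</div><p>" + article + "</p></body></html>"
page_b = "<html><body><table><tr><td class='content'>" + article + "</td></tr></table></body></html>"

print(round(text_similarity(page_a, page_b), 2))  # high overlap despite completely different markup

The point being: if only the article text is compared, two pages wrapped in completely different templates can still look nearly identical, whereas a comparison that included the surrounding code would not flag them.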

jomaxx
msg:919488
3:29 am on Mar 16, 2006 (gmt 0)

I disagree that HTML markup plays an important role in identifying duplicate content. Why would it? They're looking to eliminate duplicate content, not simply site mirrors.

Stefan
msg:919489
4:11 am on Mar 16, 2006 (gmt 0)

Ok, that might be true (although I'm not sure it's the case). Nevertheless, I was trying to assure the original poster that his articles posted on others' sites would not trip a dupe content filter.

I'm having deja vu here from a similar thread months ago. Jomaxx, could you explain to me why two pages with identical code would not be seen as dupe content? If they weren't, what exactly would be dupe content, and why would anyone ever have canonical problems? Not meaning to be argumentative, but I really don't understand that.

jomaxx
msg:919490
5:47 am on Mar 16, 2006 (gmt 0)

Actually with respect to canonical issues in particular Google might well look at everything, because the point there is to determine if domain.com is identical to www.domain.com, for example.
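
To picture the www/non-www case: the whole question is whether two different URLs resolve to the same page. A toy normalization in Python (again just my own illustration, not how Google handles it internally) might look like this:

from urllib.parse import urlsplit, urlunsplit

def canonical(url):
    # Collapse trivially different URLs to one preferred form so that
    # domain.com and www.domain.com count as the same page.
    parts = urlsplit(url.lower())
    host = parts.netloc[4:] if parts.netloc.startswith("www.") else parts.netloc
    return urlunsplit((parts.scheme, host, parts.path or "/", parts.query, ""))

print(canonical("http://www.example.com/article.html"))
print(canonical("http://example.com/article.html"))
# both print http://example.com/article.html, i.e. they count as one page

On your own site you would normally settle this with a 301 redirect from one hostname to the other, so only one version gets indexed in the first place.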

But the duplicate content issue is different from that, and IMO Google is likely to look mainly or wholly at the text and not at the markup. Identifying DMOZ clones is a classic use for the duplicate content filter. Each site could have a completely different URL structure and page layout, yet the text itself is going to be substantially duplicated.

As for the original poster's concern, I don't know if it's well established what criteria Google use to determine which site is authoritative. My gut feeling is that it has less to do with which site was in the index first, and more to do with the same factors that determine which page would get ranked higher.

Dectomax
msg:919491
12:07 pm on Mar 16, 2006 (gmt 0)

Thanks for the replies.

I understand that the article directory hosting my published article may rank higher than my page, but that's not a major concern. My main concern is whether or not my original page will be penalized as duplicate content.

There must be some way the search engines ascertain which is the original article; otherwise, couldn't a popular, well-ranked site strip the content from a totally original yet not-so-well-ranked site and cause the not-so-well-ranked site (with all the original content) to fall out of the SERPs under a duplicate penalty?

larryhatch
msg:919492
12:21 pm on Mar 16, 2006 (gmt 0)

Decto: It sounds like you have some writing skills, so how about this?

Rewrite the same article for submission to the high-PR site, reword this and that, etc.
Enough small changes and the dupe-content issue is laid to rest. You remain the author.

Make SURE the larger site links back to YOU. That would be my concern. -Larry
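
On the "enough small changes" point: a crude way to eyeball how different a rewrite looks to a naive text comparison is the standard library's difflib. Purely illustrative, with made-up example sentences; real duplicate detection is far more sophisticated, and nobody outside the engines knows the thresholds:

from difflib import SequenceMatcher

original = ("Regular oiling keeps a widget hinge quiet and can extend its "
            "working life by several years in most climates.")
rewrite = ("In most climates, oiling the hinge regularly keeps a widget quiet "
           "and may add several years to its working life.")

verbatim_score = SequenceMatcher(None, original.lower(), original.lower()).ratio()
rewrite_score = SequenceMatcher(None, original.lower(), rewrite.lower()).ratio()
print("verbatim copy: %.2f, light rewrite: %.2f" % (verbatim_score, rewrite_score))
# the verbatim copy scores 1.00; the reworded version scores noticeably lower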

urlreader
msg:919493
8:34 pm on Mar 16, 2006 (gmt 0)

It really depends on how the search engines define "duplication". Simple HTML will be stripped out. However, if you change the words here and there, there is no way for them to detect it, simply because the technology is not there yet.

url

Stefan
msg:919494
11:47 pm on Mar 16, 2006 (gmt 0)

Good advice, Larry.

Jomaxx, you might be right, but I was always under the impression that G only applied the filter to pages that are essentially identical (I might be wrong). I see many pages in the serps that have the exact same "articles" in most of the top ten results. For the Dmoz and Wiki clones, I always assumed they were just buried in the serps, but not removed entirely - you do find them showing up a few pages down.

I do know that articles that I've contributed to other sites, word for word, usually appear slightly below my originals for the appropriate kw's, but both versions still show up just fine.

Decto, Larry's advice is best anyway - just do a bit of a rewrite (probably doesn't have to be much), and make sure your version has a few more mentions of the kw's that you're aiming for.
