

Content, Writing and Copyright Forum

    
Does Entire Site get Penalised for Duplicate Content?
What happens if I load duplicate content onto an otherwise original site?
Tigrou




msg:920785
 11:03 pm on Apr 25, 2005 (gmt 0)

I have a site with original content. It ranks well and users like it, but it only has 200 pages.

I want to add 800 more pages of well-written and useful content. The problem is this new content isn't original (public domain). I BELIEVE that I've mixed up the text/layout enough that it will be considered original...

...BUT if I am wrong then will the SEs just ignore the 80% of the site that isn't "original" in their eyes? Or will they penalise the site's original 200 pages for being associated with "duplicate content"?

Thanks.

 

monkeythumpa




msg:920786
 6:17 pm on Apr 26, 2005 (gmt 0)

Search engines don't rank sites, they rank pages.

Tigrou




msg:920787
 6:42 pm on Apr 26, 2005 (gmt 0)

monkeythumpa, pages interconnect and SEs rank that interconnection. Throw 80 pages of white-link spam with 5 H1s on a site with 20 otherwise good pages and you'll see the 20 good ones fall.

Do you have any specific feedback/proof that this doesn't occur with flagged dupe content?

Thanks.

BigDave




msg:920788
 11:02 pm on Apr 26, 2005 (gmt 0)

Search engines don't rank sites, they rank pages.

Often quoted and just as often wrong.

It would be absolutely stupid for a search engine to not consider the site factors while ranking the page. And search engine engineers ain't stupid.

As for the dup content, I believe that Google generally only filters out the duplicate pages. But I suppose there might be some threshold where, if 80% of the site is duplicate, they just assume the other 20% is too.

Marcia




msg:920789
 11:15 pm on Apr 26, 2005 (gmt 0)

Google is forgiving to a degree; dup pages get axed. With Yahoo, be very, very careful. MSN is still getting its sea legs, so no idea.

Tigrou




msg:920790
 1:35 pm on Apr 27, 2005 (gmt 0)

Thanks for the feedback. I'll go for it and report back in a few months.

Since my new content would actually be useful and mixed with other fresh/public-domain material, it should easily pass a hand test.

Knowing Yahoo from other filters, they tend to use a hand check if an automated report is "iffy".

(This assumes, though, that the filter is set up to return an analog dup percentage and not a binary "dupe/non-dupe" flag.)
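
Just to illustrate what I mean by an analog percentage: here's a minimal sketch assuming the filter scores pages with something like word-shingle Jaccard similarity. That's purely a hypothetical stand-in on my part, not a claim about how any SE actually does it.

# Hypothetical sketch: an "analog" dup score as the Jaccard similarity
# of two pages' word shingles, expressed as a percentage.

def shingles(text, k=5):
    # Overlapping k-word shingles of the page text.
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

def dup_score(page_a, page_b, k=5):
    # Percentage overlap of the two shingle sets (0 = no overlap, 100 = identical).
    a, b = shingles(page_a, k), shingles(page_b, k)
    if not a or not b:
        return 0.0
    return 100.0 * len(a & b) / len(a | b)

# Example: two near-identical snippets score 50%, not a binary dupe/non-dupe.
print(dup_score("the quick brown fox jumps over the lazy dog today",
                "the quick brown fox jumps over the lazy cat today"))

Scores in the middle band (not clearly 0% or 100%) would be exactly the "iffy" cases a hand check catches.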

Teshka




msg:920791
 11:05 pm on Apr 28, 2005 (gmt 0)

I have a site that had good links but was about 90% duplicate content. It barely got any traffic at all from Google (though MSN and Yahoo didn't seem to mind) even though it was out of the sandbox. A few months ago, I decided to start writing some articles, and now it's probably about 50% original content and 50% duplicate content. I have started getting a lot more traffic from Google. I think whatever penalty I might have had is gone.

I know this doesn't exactly answer your question, but just based on my experience, I think it may be a percentage kind of thing. Some duplicate content may be fine as long as you continue adding original content. If you have something like four times as much duplicate content, then you might need to worry about being penalized.

Tigrou




msg:920792
 11:18 pm on Apr 28, 2005 (gmt 0)

Teshka, thanks a lot. That was a very useful and interesting post.

SlimKim




msg:920793
 2:44 am on May 5, 2005 (gmt 0)

Interesting thread.

I have wondered whether a site consisting mostly of articles generated by a news feed would do well at G, Y and MSN, or whether the duplicated articles would be a problem.

howiejs




msg:920794
 5:28 pm on May 13, 2005 (gmt 0)

"i have wondered if a site consisting of mostly articles generated by a news feed would do well at G, Y and MSN"

I have been testing this a bit -- still too early to tell. Google seems to be indexing the pages (decent site already), but no traffic on the news ones yet.

Any other comments on the above?

Tigrou




msg:920795
 7:13 pm on May 13, 2005 (gmt 0)

Howiejs, no comment on that quote.

I haven't implemented the original plan yet. That particular site went offline for a few days due to a foolish host. I'll try it when things are stable again.

Tigrou




msg:920796
 5:57 pm on Jun 13, 2005 (gmt 0)

Hi howiejs,

Sorry for the lack of an update, but even after being down for a week <painfully long story>, that site was treated well by Bourbon, so now I'm risk averse and want to avoid dup filtering across the entire site.

Now I've downloaded a whack of free (and useful) info and humans are editing it offline. I'll supplement each page that goes up with info from a mix of sources. The problem is that this takes time (and a bit of money), so it'll only be up in July.

Sorry I can't give a better update.
