| 4:43 pm on Apr 13, 2007 (gmt 0)|
I'd better clarify what I mean by "penalty."
The penalty I worry about is a site-wide penalty. Will having a lot of duplicate content signal to Google that my website is spam or an aggregator site they shouldn't bother to index very often?
I'm less concerned about penalty on the actual duplicated pages themselves. If those pages do get a penalty, I don't mind. I just don't want the penalty to negatively affect the rest of the content that _is_ original.
| 5:28 pm on Apr 13, 2007 (gmt 0)|
You would think that being "first to publish" would protect you, but it doesn't.
If the site that publishes after you is a major authority site for that topic, they will soon replace you in the SERPs.
If you are lucky you drop many places; if you are unlucky you completely disappear.
| 6:00 pm on Apr 13, 2007 (gmt 0)|
Also, if Google does not like your site, any scraper site will be ahead of you in the SERPs.
| 6:22 pm on Apr 13, 2007 (gmt 0)|
>One possible solution is to ask for a period of exclusivity.<
I don't think that would work, simply because by the time the poster reaches your site they could have submitted the article to 50 other places. Plus you merely have suggestive power unless you plan to enforce something like that, which would be almost impossible and would require a huge amount of time. Bottom line: exclusivity benefits you but not them, so the phrase "what's in it for me" comes to mind.
From what I have observed, priority for a duplicate article goes to the older domain, not to whoever posted it first. It also seems to relate to the article site's size.
Article posters run from site to site to gain links, and many have automated the process. Unless your site has at least 10,000 articles I would expect many pages to end up in the supplemental results.
| 8:01 pm on Apr 13, 2007 (gmt 0)|
g1smd - I was afraid of that. Thank you for pointing that out.
asher02 - would that apply to just the duplicate pages, or would it apply to all my content, even the pages that are original?
outland88 - Ouch. I do not have 10,000 articles and my site is less than 5 months old. Thanks for the warning.
| 8:48 pm on Apr 13, 2007 (gmt 0)|
I have many reprinted articles on my site. Over the years I have found that articles on my site end up higher in Google's results than on other web sites, because Google seems to consider mine an "authority" site. In the short term, Google for the most part ignores new reprinted articles I put up. Over the years, however, as other web sites are taken offline and your site gains momentum (more incoming links), Google will favor your site.

It also really depends on the niche you're in. A lot of the articles I publish are not published in very many other places, so that is definitely in my favor. I have a number of articles that I reprinted 7-8 years ago where Google only has my copy listed and sees it as "original content," because even the author's web site is now gone.

For me it is a balance between chasing Google traffic and pleasing my returning visitors. I cannot possibly write enough original articles to please returning visitors and make my newsletter readers happy each week. I would probably make more money if I had all original content, but Google has given high rankings to many of my reprinted articles for a number of years (even now), so in the long run it is worth it for me.
To answer your question, however, some reprinted articles rank well for me and others don't. Just depends on the article!
| 6:16 pm on Apr 14, 2007 (gmt 0)|
I did a controlled test on the effect of syndicating unique content. What I found is that the TrustRank or "authority" of the site was the most important factor in controlling duplicate content and supplemental pages for republished articles.
Bottom line: if you want to syndicate your own content or republish someone else's, you have to have an authority site.
| 9:05 pm on Apr 14, 2007 (gmt 0)|
That's very helpful, guys.
I do think I have an authority site. It has a PR 6 and it has several links from top 100 blogs in general, as well as top 100 blogs in our niche.
But then again, I might just be deluding myself. =)
| 2:49 am on Apr 15, 2007 (gmt 0)|
Do you rank in the top 10 for some highly competitive keywords in your industry? That's how you know you have a "trusted" site.
| 3:35 am on Apr 15, 2007 (gmt 0)|
It seems like I've also read somewhere that there are other factors besides authority in deciding which copy gets marked as the duplicate, like how clean the URL is, the load time of the pages, etc. It definitely isn't just who owns the material or who published it first.
| 5:50 am on Apr 15, 2007 (gmt 0)|
If someone steals your content by syndicating it, it can devalue your site, since multiple copies will appear and some of those copies can be more powerful than yours.
Also, another theory seen...
...if the content is stolen by sites in specific regions, it can cause your site to rank in those regions, and may cause a loss of rankings in the country where you are actually located.
| 9:20 pm on Apr 15, 2007 (gmt 0)|
Freelistfool, we're not in the top 10 but we are in the top 20.
However, based on reading the "SERP shakedown" threads on here lately, I'm not quite sure if that is a permanent thing. We might be dropped to the boonies next week. =(
Thanks for the additional factors Divad and Optimist. I will definitely take those into consideration.