Forum Moderators: Robert Charlton & goodroi
It is only on your site once, so no duplicate content there, of course.
However, this page of content does exist elsewhere on the internet.
I know all about Copyscape, etc. I am trying to add value for customers and visitors. The news really does add value, and will be of great interest to people on the page with the news link. Indeed, I feel that there is a good chance they would not find this information otherwise if it were not for my site - my site will effectively organize the news by topic from multiple sources and gather all of this news into one location. And, many news agencies delete the news after a period of time, so it cannot be found at that source again anyway, so I would also be "saving" it.
So, the question is: will there be a duplicate content penalty for doing this?
I have seen some posts here from 2005 and 2006 from people saying they've been doing this for years, and it is a non-issue. But I am unsure if everyone still feels this way now.
Forgive me, but I am looking for facts derived from experience, and not just speculation.<grin> As it is, my mind is a blur from all the speculation on this. ; )
I have seen many posts from g1smd on duplicate content, and I've even posted on duplicate content here a lot, but I am uncertain about duplicate content across domains.
Thank you!
However, if there is any chance the content is a duplicate, either within your domain or across multiple domains, then you must disallow robots (in robots.txt) from getting to it so they can't index it; otherwise it will cause you duplicate content penalty grief.
i.e. duplicate content penalties will result even if the content is duplicated across domains.
Is the above correct? Must you disallow content like this in robots.txt, because you will get a dupe content penalty otherwise?
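For clarity, this is the kind of robots.txt disallow being described above - a minimal sketch only, and the /news/ path is just a made-up example for whatever directory holds the syndicated stories:

```
# Block all crawlers from the syndicated news section
# (the /news/ path is a hypothetical example)
User-agent: *
Disallow: /news/
```

Note that a Disallow line like this stops compliant crawlers from fetching those URLs; whether that is actually necessary to avoid a penalty is exactly the question being asked here.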
In the rare cases in which we perceive that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we'll also make appropriate adjustments in the indexing and ranking of the sites involved. -Adam Lasnik on Duplicate Content [webmasterworld.com]
When it comes to news content, it's pretty easy to find more than one domain in the results with the same story.
helpnow, in my opinion, your advice is more geared toward duplication within a given domain.
I get the impression that Google may now consider duplicate content all the same, whether it is within a domain, or across separate domains.
The pages don't get filtered out of the SERPs so that they no longer exist - they still exist in the SERPs, but they are shoved down the rankings.
I know that dupe content within a site will affect every page's rankings, even pages not involved in the dupe content. This makes sense to me for the URLs involved in the dupe content, because PR may be spread amongst all the duplicate URLs, leaving none of them with enough PR to rank very well. However, pages not involved in the dupe content losing rankings doesn't make sense to me, yet it still happens and has happened to me. But that is another thread. ; )
What I am trying to put a finger on is: does this same phenomenon of lost rankings occur with dupe content across separate, distinct domains? Has anyone experienced this? Has anyone got dupe content that exists on other domains while their rankings remain intact?
(Marcia, sorry, I am trying to avoid issues of permission in this discussion. Whether permission has or has not been granted is unknown to Google, and I suspect rather moot, unless the owner complains to Google. Consider a manufacturer who grants permission to a retailer to use any of the manufacturer's content - the same possibility of dupe content arises.)