In casual speech people use the phrase "duplicate content penalty," when in reality a true penalty is rare. What actually happens, in the case of cross-domain duplication, is that Google may filter some of the duplicate URLs out of a given search. They do make an effort to display the original source when they can detect it, and having the reprinted article link back to your site is the recommended approach.
You can always wait to give permission for reprinting an article until after you see your url showing in the search results - that's an added layer of protection.
I'd be careful about letting a more established site use your content. I've got an article that I let someone else publish on their site with a link back to the original page on my site. It's been two years and his copy of my content shows in the SERPs, even though the page on my site has so many backlinks to it that it's a PR 5 (his copy is a PR 2).
His site had more "authority" at the time the article was published. That seems to be the key as I've got other articles that I've let people copy on less "authoritative" sites than my own that rank just fine.
I would beware of letting a more authoritative site have a copy of your content. That is a recipe for failing in the SERPs.
thanks for those replies... yes, we've come round to thinking 'we write the articles, so we'll use them only on our website'.
It can only enhance our website, and if the content is fresh, unique and up to date, then hopefully other sites will link back to ours naturally...
From what I understand, if the original source can't be determined, preference will go to the site with the higher PR.
They do try to determine the "original" — basically, whichever copy went online first. But there is no guarantee that will matter when confronted with a page with higher PR, authority, etc.
This is especially risky if you ever move a page, even with a 301, after the more established site has the content since now that other URL will have had the content longer.
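If you do move a page, the standard fix is a permanent redirect from the old URL to the new one. A minimal sketch in Apache .htaccess terms — the paths here are just examples, not from this thread:

```
# .htaccess: permanently redirect the old article URL to its new location,
# so link equity and crawlers follow the page to its new home
Redirect 301 /old-article.html https://www.example.com/new-article.html
```

Even with the 301 in place, as noted above, the other site's copy may still end up looking like the longer-standing URL for that content.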
Have the new/bigger site noindex,follow the page. They get the content, and you get some visitors, but there is no duplicate content.
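For reference, the noindex,follow instruction is a standard robots meta tag placed in the page head of the republished copy — nothing here is site-specific:

```html
<!-- In the <head> of the republished article: keep this copy out of the
     index, but still let crawlers follow its links (including the one
     back to the original) -->
<meta name="robots" content="noindex,follow">
```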
It's not just article writing that is at risk. I think any site distributing content needs to pay attention to how the distributor links back to your site.
If the direct linking is for one or two articles, it may be less of an issue, provided your site has good PR [ie strength]. But if you're syndicating large amounts of content to one site, and it's linking back to the original, particularly on multiple identical pages, you may find the originating website gets sunk. That risk is highest with affiliate distributors.
Yep - that's the answer, draw up an agreement that says the site must exclude robots with a robots.txt.
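Such an agreement could be satisfied with a couple of lines in the republisher's robots.txt — a minimal sketch, where the /reprints/ folder name is just an example:

```
# robots.txt on the republisher's site: keep crawlers out of the
# folder that holds the reprinted articles
User-agent: *
Disallow: /reprints/
```

Note that robots.txt blocks crawling of those URLs; if you want the pages reliably kept out of the index, the meta noindex approach mentioned elsewhere in the thread is the more direct tool.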
you could rewrite the article for them. That way everyone's happy.
(a) they get original content
(b) you get your high PR link
Just make sure they don't link back to the original article as this won't be a great user experience.
Get them to link to another related article or another page in your site.
If you want to experiment, you could try letting them republish just one article, and then see what effect it has on the SERPS.
just get them to put it in a folder that's blocked in their robots.txt file, and/or make sure there is a meta noindex,nofollow on each page.
What I usually do when I republish articles is make sure there is a full URL pointing to the original article page. This way Google can see that I am not the author, since it can follow the exact URL. Just make sure your link points exactly to the original source on your site, and you should be fine. I have been doing this for over 5 years and it keeps every author happy.
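That link back is ordinary HTML on the republished page — a sketch, with example.com standing in for the original site:

```html
<!-- At the foot of the republished article: an absolute URL that points
     exactly at the original source page (example.com is a placeholder) -->
<p>This article originally appeared at
  <a href="https://www.example.com/articles/original-article.html">example.com</a>.</p>
```

The key detail from the post above is that the href is the full, exact URL of the original page, not just the site's homepage.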
Offer to do a rewrite for them, paraphrase the original content, and place links in the content to the original article. Win / win.