Thanks for your replies, goodroi and tedster.
"In the rare cases in which we perceive that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we'll also make appropriate adjustments in the indexing and ranking of the sites involved."
"When in doubt, I'd do what makes most sense from a user perspective (what is most helpful to human visitors). You could also block those pages using robots.txt if you're worried about how search engines will view them."
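For anyone unfamiliar with the robots.txt option mentioned in that quote, a minimal sketch might look like the following (the /syndicated-articles/ path is purely a hypothetical example, not my actual site structure):

```
# robots.txt - hypothetical example of blocking duplicated articles
# Disallow all crawlers from the directory holding the syndicated copies
User-agent: *
Disallow: /syndicated-articles/
```

Note that Disallow only prevents crawling, not indexing of URLs discovered via links elsewhere, so it is a crawl-control measure rather than a guaranteed removal from search results.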
I can't see a negative effect for searchers; no one is being "deceived". It should be easy for Google to filter out my articles if it doesn't want to show them on my site and prefers to show them on a site that can't guarantee you'll get the requested content because of an insane number of server errors.
"with intent to manipulate our rankings" Google
"inflating content only to rank better" tedster
Although I know there is no definite answer: Google doesn't know my intention. Is a well-rewritten article with relevant links to my unique content more dangerous for my site than an unedited version? On one hand I may seem to have some sort of manipulative intention; on the other hand I signal that I care about my content...
My current opinion: the most important factors are the amount of added value and the quality of the rewriting. Mindless keyword stuffing is not the solution. I will start with a test sample and gradually increase the amount of content. I suppose goodroi and I agree that releasing 200k at once is not a good idea.