Forum Moderators: Robert Charlton & goodroi

syndicating wikipedia / duplicate content issues

Does Google's duplicate content filtering penalize Wikipedia syndication?


mcglynn

3:27 pm on Nov 14, 2006 (gmt 0)

10+ Year Member



I've had site owners inquire about syndicating content from Wikipedia as an easy way to flesh out their sites in order to get better search-engine rankings. Their thinking is, "Wikipedia has authoritative, relevant content that I can re-use for my site thanks to the GNU Free Documentation License; therefore my rank in search engines will rise because _my_ site will also have authoritative, relevant content."

This seems like a bad idea to me, because I assume Google can detect syndicated Wikipedia content. Wikipedia's "Reusers' rights and obligations" require "conspicuous" backlinks within the verbatim copy or derivative work. No doubt such a backlink would tell Googlebot where to look for duplicate content, and no doubt Googlebot would assume Wikipedia, not my site, was the original source.

Can anyone confirm whether google or Matt Cutts has addressed this specific example of syndication and duplicate content?

g1smd

1:25 pm on Nov 20, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Google will index a few such sites and then begin filtering the rest... just as they have done with ODP clones.

If they aren't already doing this, then it cannot be long before they do...