Forum Moderators: Robert Charlton & goodroi


Duplicate content stringency

How closely does Google look when comparing two pages?


Tapolyai

10:56 pm on Feb 5, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I have a few hundred unique articles on my site currently.

I am interested in distributing some of them for free, with links back to my site.

How closely does Google look at a page for content comparison? Will my formatting and presentation (through a CMS/forum software) make it "different"? Or is it a percentage that is looked at? If a percentage, how much is too much?
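For context on the "percentage" question: duplicate detection is generally believed to work on the extracted text rather than on raw HTML, often by comparing overlapping word sequences ("shingles"). Google's actual algorithm is not public; the sketch below is only an illustration of the shingling idea, with made-up example strings, showing why changing markup alone tends not to make a page "different":

```python
# Illustrative sketch only: word-shingle Jaccard similarity, a common
# way to estimate how much two texts overlap. NOT Google's algorithm.

def shingles(text, n=4):
    """Return the set of n-word shingles from the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(text_a, text_b, n=4):
    """Jaccard similarity of the two shingle sets, from 0.0 to 1.0."""
    a, b = shingles(text_a, n), shingles(text_b, n)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Since shingles are built from the words themselves, a page that is
# re-styled by a CMS but keeps the same article text scores 1.0.
score = similarity(
    "ten tips for writing unique articles",
    "ten tips for writing unique articles",
)
```

Under a scheme like this, only rewriting a substantial share of the wording lowers the overlap score; presentation changes do not.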

lammert

11:40 pm on Feb 5, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I do not know percentages, but I do know one site in my niche which provides free articles. They are a non-profit organization and write really good content about specific issues. I have found about 10 sites that have copied the content of this site (legally), and normally, for some queries, at least one of these sites, or the original site, would be at the top position in the SERPs.

Since BigDaddy, Google uses a new algorithm for canonicalization. Now ALL of the sites--including the original one--have disappeared from the top positions in the SERPs. It seems that if on the Bigdaddy infrastructure Google can't determine which URL is the original one, it rather doesn't list any of them.

I would also like to quote from Matt Cutts' blog about Bigdaddy and 302 redirects:

Matt says:
My only point is that the new infrastructure at the Bigdaddy data center will let us tackle canonicalization, dupes, and redirects in a much better way going forward compared to the current Google infrastructure.

So my advice: don't distribute your content at the moment, because the rollout of Bigdaddy might cause some real problems.

Tapolyai

1:11 am on Feb 6, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I see. Thank you.

If I do not have it already published on the site, then I presume this is not an issue. Correct?

annej

5:43 am on Feb 6, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Along the same lines, I used to share articles now and then, but not since duplicate content has become a problem.

Now I’m thinking of writing some articles for other sites. If I took a couple of related articles and wrote a shorter version combining the two, would that be enough change to avoid duplicate content problems? How different from the originals do the new articles have to be?

lammert

8:33 am on Feb 6, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I searched a little bit further. In the same niche, there is also a wiki project. This project started back in 1996, and since then about 80 sites have copied the contents (also legally). For one specific query I always had at least three of those copies in the first 15 listings in the SERPs. Now with Bigdaddy, only one copy shows up at position #2, and in the next 250 listings, none of the other copies is present.

IMHO, since the rollout of Bigdaddy, all the knowledge gathered in the past about the percentage of duplicate content allowed by Google is obsolete.

annej

8:51 pm on Feb 6, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



all the knowledge gathered in the past about the percentage of duplicate content allowed by Google is obsolete


I think a lot might as well be put on hold until the changes that Google is making are completed. I'll just keep working on content and put off any article sharing plans until later.

Ev_olution

3:16 am on Feb 7, 2006 (gmt 0)

10+ Year Member



I have an article that ranks well with an affiliate on Google. This same article has also been syndicated through an article marketer; both have links. Is that a problem for my high-ranking affiliate? How does Bigdaddy affect a service like PRWeb?

Thank you,
Ev