martinibuster - 7:45 am on Mar 28, 2012 (gmt 0)
That's a different issue altogether, imo. The efficacy of that may be affected by the sites that publish the reproduced content: crap inbound links getting devalued, a funky link graph, and so on. That's a problem with the sites publishing the content, not with the technique itself.
I'll give you an example of a technique working because of positive signals. I believe this loophole or bug in Google's algo is one of the reasons why copied/stolen content on decent sites can outrank the original documents. Google is making decisions based on quality signals, and that allows the technique to work as long as the web pages have enough of the positive signals and lack the negative ones. I just removed a long post by someone who was posting the stolen content across a lot of forums. His documents, and those of others who have republished the content, are outranking the original publishers'. That's a loophole and a bug in the way the algo works, imo.