falsepositive - 7:54 pm on Feb 26, 2011 (gmt 0)
Some speculation on how the algo could work?
1. The "content farm" algo is tied to the "duplicate scraper" algo, which identifies scrapers and cleans them out while keeping legit sites/original sources up.
2. The "content farm" algo imposes a sitewide penalty on you if you have enough copied/repurposed stuff on your site. THAT is what pushes you down below your scrapers, especially if you are a big site. The trigger is the percentage of your content that exists verbatim elsewhere.
3. They will then run the "duplicate scraper" algo to clear out the crap/garbage. The first phase (the "content farm" clean-out) caused collateral damage by marking some legit sites with the sitewide demotion/penalty. However, your penalty could potentially be lifted once the duplicate scrapers are identified and removed.
Maybe the definition of "quality" for your site is based on that sitewide parameter. Your authority is not the issue here; it is whether you triggered this new penalty that marks you as copying/regurgitating. G could only add that sitewide penalty after testing the duplicate scraper algo and making sure it worked.
There's basically a measure of uniqueness being computed here, and that is what defines "quality".
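To make the speculation concrete: if the trigger really is a percentage of verbatim-duplicated content, a toy version might look like the sketch below. Everything here is made up for illustration — the shingle size, the 30% threshold, and the idea of averaging over pages are my assumptions, not anything Google has confirmed.

```python
def shingles(text, k=8):
    """Split text into overlapping k-word shingles; a standard way
    to detect verbatim copying (assumed, not Google's actual method)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def duped_fraction(page, known_copies):
    """Fraction of this page's shingles that also appear in other documents."""
    own = shingles(page)
    if not own:
        return 0.0
    elsewhere = set()
    for other in known_copies:
        elsewhere |= shingles(other)
    return len(own & elsewhere) / len(own)

def sitewide_penalty(pages, known_copies, threshold=0.30):
    """Hypothetical trigger: demote the whole site if the average
    duped fraction across its pages crosses a (made-up) threshold."""
    avg = sum(duped_fraction(p, known_copies) for p in pages) / len(pages)
    return avg >= threshold
```

On this model, a site whose pages are mostly verbatim copies trips the sitewide flag, while a site with original text does not — which would explain why the penalty looks binary and sitewide rather than page-by-page.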
Just speculation of course.