santapaws - 4:22 pm on Oct 24, 2012 (gmt 0)
Take scraping, for example: you can't do that, and when I'm personally scraped I get hammered for duplicate content, apparently because I don't have enough (cough cough) good signals. Yet two of the biggest sites out there, Google and Wikipedia, scrape massive amounts of content. Yes, I know they do OTHER things too, but clearly you CAN rank with scraped content IF it suits them. My point is that it's becoming more and more impossible for anything other than a major player to rank BECAUSE of the so-called guidelines. I think that's how they want it. De-optimise to the point of self-destruction and the job is done for them, no algo required. So it's now more about the whitelist that lets some sites through than anything else. Of course, your vertical will see this play out in a bigger or smaller way depending on whether it's a targeted niche.