claaarky - 11:28 am on Sep 26, 2012 (gmt 0)
At the risk of sounding like a broken record, I just don't see how Google could reliably judge the quality of every page on every site by analysing the content.
Quality is not something you can programme into a computer. It's a human perception that changes every day based on our own life experiences, the sites we visit, world events, seasons, fashion. A page that's quality today may not be quality tomorrow, for those very reasons.
The way to judge which pages and sites people regard as quality and relevant is to capture data on how people behave and react. Compare the metrics of sites in a niche and the low quality ones will stand out like a sore thumb. Work out the average, build in an allowance so you don't catch borderline cases, and you have the basis of a system to demote the worst sites.
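Just to illustrate what I mean (this is a rough sketch, not how Google actually does it, and the metric names and numbers are made up): take one engagement score per site in a niche, work out the average, knock off an allowance so borderline sites are left alone, and flag only what falls clearly below that line.

```python
# Illustrative sketch of the idea above: average a niche's engagement
# scores, apply an allowance, and flag only the clear outliers.
# All site names, scores, and the allowance value are hypothetical.
from statistics import mean


def flag_low_quality(site_scores, allowance=0.25):
    """Return sites scoring below the niche average by more than the
    allowance (a fraction of the average), so borderline cases escape."""
    avg = mean(site_scores.values())
    threshold = avg * (1 - allowance)
    return sorted(site for site, score in site_scores.items() if score < threshold)


# Made-up engagement scores for four sites in one niche
niche = {"site-a": 0.82, "site-b": 0.78, "site-c": 0.31, "site-d": 0.74}
print(flag_low_quality(niche))  # only site-c falls well below the average
```

With these numbers the average is about 0.66 and the threshold about 0.50, so site-c is the only one demoted while the borderline sites are untouched.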