econman - 7:28 pm on Apr 1, 2011 (gmt 0)
Granting that's true, you seem to assume that this "minimum quality threshold" would be "just barely better than crappy." Why? Why couldn't Google release a threshold definition that's "pretty darn good"?
Google says it is attempting to detect and downrank "low quality" pages/sites. They've said nothing about below-average or high quality.
Consider what would happen if they were to set a higher standard, where any page or site that is below par (with "par" set at "pretty darn good") would be pushed down the SERPs: many more sites would be affected, and the tradeoff between quality and relevance would be much more severe. Likewise, the collateral damage from flaws in the system would be far worse than anything we've seen with Panda.
Aside from that, there is also the problem that their system is probably a lot weaker than they would like it to be, and the more they tell us about their criteria for "quality," the more obvious its flaws would become. Can you imagine how bad the PR would be if they articulated a clear and unambiguous definition of poor quality? Everyone would start noticing all the instances in which a poor-quality page ranks highly and a high-quality page ranks poorly.