... is it so unreasonable for Google to simply filter all the pages and rely on reinclusion requests to correct the few instances where the duplicate content might be legitimate and in keeping with Google's stated mission?
Yes, it is unreasonable. Filtering out most of the pages might be defensible if it helps them battle spam; OTOH, it's bad for the user and for the SERPs to filter out all of the relevant pages.
... few instances ...
The reason I posted about the issue at all is that I believe we've crossed the threshold into way more than a few instances. And this is part of a disturbing trend of accepting ever-increasing levels of collateral damage, presumably in the fight against spam.
In situations like the bee examples above, why shouldn't the burden be on the publisher to demonstrate that the duplicate content is legitimate and of value to users?
If G cares about the quality of their SERPs, as they surely do, the onus is on them to show the best pages possible for a given search. The less able they are to achieve this goal over time, the worse it is for all concerned parties: G employees, G shareholders, and G users. Deteriorating quality helps no one.
I did not say that fighting spam was easy. But if G is to retain its standing, it must find ways to control spam without accepting ever-growing levels of collateral damage. Sort of obvious on its face, I would think.