buckworks - 5:30 am on Feb 4, 2011 (gmt 0)
I don't have any answers either, Ted, and it worries me too. The biggest hazard of manual banning that I can see would be judgements made by people who only have a superficial knowledge of the subject area they're assessing.
Two facts loom large in the present situation: (1) Google has traditionally put a lot of value on independent editorial links, and (2) strong search rankings are some of the best link bait there is.
Several years ago Mike Grehan described what happens in his article Filthy Linking Rich And Getting Richer! [e-marketing-news.co.uk...]
Content whose main virtue is being easy to find ends up getting linked to, liked, and tweeted more often than better content written with less knowledge of how to suck up to the search engines.
A lot of dubious content has ended up with stronger "signals of quality" than it deserves, for no other reason than that the search engines granted it higher visibility. It's famous because it's famous, not because it's good.
An even bigger problem is that when there's so much plagiaristic sludge at the top, genuine subject matter experts have a strong disincentive to write much at all.
"Google's mission is to organize the world's information and make it universally accessible and useful." [google.com...]
That sounds noble ... but Google has been creating distortions in the world's information even as it tries to organize it.