tedster - 9:36 pm on Feb 27, 2011 (gmt 0)
I don't think any mere tweak is going to spot the kind of low value that eHow seems to offer so much of the time. The problem is they do use literate writers who produce grammatical sentences and paragraphs. The pages do have semantic variety. The word count goes well beyond a stub page.
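To illustrate the point (a hypothetical sketch only, not anything Google has published): a naive quality filter built on exactly those surface signals would wave an eHow-style page straight through. The function name and every threshold below are invented for illustration.

```python
# Hypothetical sketch of a surface-level "quality" filter.
# All thresholds are made up; nothing in this thread suggests
# Google actually works this way.
import re

def looks_low_value(text: str) -> bool:
    words = re.findall(r"[a-zA-Z']+", text.lower())
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    if len(words) < 250:                     # shorter than a stub page
        return True
    if len(set(words)) / len(words) < 0.2:   # low vocabulary variety
        return True
    avg_len = len(words) / max(len(sentences), 1)
    if not 8 <= avg_len <= 35:               # garbled sentence structure
        return True
    return False

# A literate 500-word eHow article with varied vocabulary and normal
# sentence lengths passes every check, even if it says nothing useful.
```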
The way you discover that an article is crap is by reading it all the way through and realizing it told you nothing useful. So there's not even a fast bounce back to the SERPs for a next choice.
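For contrast, here's a sketch of the behavioral signal that goes missing in that scenario: a "pogo-stick" check that flags a result when the searcher bounces straight back to the results page. The event fields and the 30-second cutoff are my assumptions, not a known Google signal.

```python
# Hypothetical pogo-stick detector. The Click fields and the
# 30-second cutoff are assumptions for illustration only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Click:
    url: str
    returned_after_seconds: Optional[float]  # None = never came back to the SERP

def is_pogo_stick(click: Click, cutoff: float = 30.0) -> bool:
    """Flag a click as a quick bounce back to the results page."""
    return (click.returned_after_seconds is not None
            and click.returned_after_seconds < cutoff)

# Someone who reads a crap article all the way through returns after
# minutes, or not at all -- indistinguishable from a satisfied reader.
print(is_pogo_stick(Click("example.com/how-to", returned_after_seconds=240.0)))  # False
```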
We humans know when an article is crap because we comprehend meaning. Machine algorithms do not. Well, maybe Watson comes close.
@assabia, the timeline of that Google Trends graph doesn't even cover the update period yet.