Shaddows - 8:49 am on Aug 22, 2012 (gmt 0)
Google thinks its generic algorithms can determine the nuanced content differences between pages. They can't.
I think, more to the point, more than 10 sites regurgitate the same generic information. Probably more than 1000. In that situation, with only 10 first-page slots, 990 of those sites cannot possibly rank.
Google does not need any confidence in its ability to surface the correct page to make the entirely logical deduction that each of those sites (on my numbers) has a 1% chance of ranking, so expecting to rank leads to disappointment 99% of the time.
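The arithmetic behind that deduction, as a quick sketch (the 1000 sites and 10 slots are the post's illustrative numbers, not measured data):

```python
# Illustrative odds: N near-identical sites competing for the
# 10 first-page slots on a given query.
competing_sites = 1000
first_page_slots = 10

# If the pages are interchangeable, any one site's chance of
# landing a slot is slots / sites.
chance_of_ranking = first_page_slots / competing_sites
sites_left_out = competing_sites - first_page_slots

print(f"Chance any one site ranks: {chance_of_ranking:.0%}")  # 1%
print(f"Sites that cannot rank:    {sites_left_out}")         # 990
```

The point is not the exact numbers, it's that disappointment is the expected outcome for the overwhelming majority of sites publishing the same generic content.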
And that's before you even consider the 10 sites giving qualitatively better information.