bigjohnt - 12:01 pm on Mar 29, 2007 (gmt 0)
I believe they have started discounting inbound links that all use exactly the same anchor text, in favor of latent semantic indexing (LSI).
It does not seem natural, for example, that all 2,500 links to a site would use the exact same phrase, so a profile like that would be flagged and discounted a bit.
What the parameters are we don't know, but this has been discussed in a few places and it makes sense to me.
If G can check a page using LSI, checking a bucket full of links can be done. Heck, even I can do a decent job of it one site at a time.
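To illustrate the kind of check being described: a minimal sketch of flagging a link profile where one exact anchor phrase dominates. The 0.9 threshold and function names here are made up for illustration — nobody outside Google knows what parameters they actually use.

```python
from collections import Counter

def top_anchor_share(anchors):
    """Return the most common anchor phrase and its share of all inbound links."""
    counts = Counter(a.strip().lower() for a in anchors)
    phrase, count = counts.most_common(1)[0]
    return phrase, count / len(anchors)

def looks_unnatural(anchors, threshold=0.9):
    """Flag a profile where a single exact phrase makes up nearly all the links.

    The threshold is a hypothetical parameter, not anything Google has published.
    """
    _, share = top_anchor_share(anchors)
    return share >= threshold

# 2,500 identical links -> flagged; a mixed profile -> not flagged.
print(looks_unnatural(["blue widgets"] * 2500))
print(looks_unnatural(["blue widgets", "widgets shop", "example.com", "click here"]))
```

That is roughly what "checking a bucket full of links" one site at a time amounts to, and it's easy to see how it could run at scale.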