How would the algorithm differentiate it from a set of links which grew organically?
If I were designing an algo to try to sense where sites are abusing links, here's what I would do: nominate a few sites for a range of related terms and map their links, in and out. I would then have the algo compare other sites against that pattern. If a site deviates significantly from the pattern, I'd apply a "penalty".
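Very roughly, that comparison could be a simple outlier test against the seed sites' link profiles. The sketch below is just that idea in Python; the feature names, numbers, and penalty threshold are all invented for illustration, not anything Google actually does.

```python
# A minimal sketch of "compare to a seed pattern, penalise big deviations".
# Seed data, feature names, and the threshold are made-up assumptions.
from statistics import mean, stdev

# Hand-nominated "known good" sites for one group of related terms,
# each described by a few link-profile features (hypothetical numbers).
seed_profiles = [
    {"inbound": 120, "outbound": 45, "reciprocal_ratio": 0.08},
    {"inbound": 300, "outbound": 90, "reciprocal_ratio": 0.05},
    {"inbound": 80,  "outbound": 30, "reciprocal_ratio": 0.10},
]

def deviation_score(site, seeds):
    """Average number of standard deviations the site's link
    features sit away from the seed pattern."""
    scores = []
    for feature in seeds[0]:
        values = [s[feature] for s in seeds]
        mu, sigma = mean(values), stdev(values)
        if sigma == 0:
            continue
        scores.append(abs(site[feature] - mu) / sigma)
    return mean(scores)

def should_penalize(site, seeds, threshold=3.0):
    # Sites whose link pattern is far outside the organic pattern get flagged.
    return deviation_score(site, seeds) > threshold

suspect = {"inbound": 5000, "outbound": 4800, "reciprocal_ratio": 0.95}
print(should_penalize(suspect, seed_profiles))  # True for this made-up profile
```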
If I were Google I would simply enable my organic_link_detector() algorithm. It would work broadly as follows:
1) Engage my automated quality sensor to assess if each site linked to/from was a "quality" site.
2) Engage my humanoid coercion detector to assess if each link was obtained through an individual's natural choice, and reject any links that I detect may have been acquired through unnatural means, e.g. a request by email, by phone, or even by fax.
3) Engage my duplicate content filter, to weed out any page that contains three or more words strung together in an unoriginal manner (a toy version of that check is sketched after this list).
4) Create a white list of sites that are immune to all of the BS from steps 1 to 3.
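For what it's worth, step 3 is at least gesturing at a real technique: shingling, where pages are compared by the word n-grams they share. A toy version, with the function names and overlap threshold made up here purely for illustration:

```python
def shingles(text, n=3):
    """Every run of n consecutive words in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def looks_duplicated(page_a, page_b, n=3, overlap_threshold=0.5):
    """Jaccard overlap of 3-word shingles; high overlap suggests the
    pages share 'words strung together in an unoriginal manner'."""
    a, b = shingles(page_a, n), shingles(page_b, n)
    if not a or not b:
        return False
    return len(a & b) / len(a | b) > overlap_threshold

print(looks_duplicated("this article was written by hand for readers",
                       "this article was written by hand for readers and then scraped"))
# True: the second page is mostly a copy of the first
```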