Forum Moderators: Robert Charlton & goodroi
Now, these would all be highly relevant one-way inbound links except I didn't quite get them 'naturally'. I had to convince them to link to me.
How would the algorithm differentiate it from a set of links which grew organically?
It couldn't!
If I were designing an algo to try to sense where sites are abusing links, here's what I would do: nominate a few trusted sites for a range of related terms and map their links, in and out. I would then have the algo compare other sites against that pattern. If a site deviates significantly from the pattern, I'd apply a "penalty".
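The idea above can be sketched in a few lines. This is purely illustrative (the profile features, the deviation measure, and the threshold are all assumptions, not anything a real engine is known to use): summarize each site's inbound/outbound link pattern, build a baseline from a hand-picked "seed" site, and flag sites that drift too far from it.

```python
# Hypothetical sketch of the seed-site comparison described above.
# The features and threshold are invented for illustration only.

def link_profile(inbound, outbound):
    """Summarize a site's links as simple ratios (illustrative features)."""
    total = len(inbound) + len(outbound)
    return {
        # share of links that point in rather than out
        "in_ratio": len(inbound) / total if total else 0.0,
        # share of links that are reciprocal (same site links both ways)
        "reciprocal": len(set(inbound) & set(outbound)) / total if total else 0.0,
    }

def deviation(profile, baseline):
    """Total absolute difference between two profiles."""
    return sum(abs(profile[k] - baseline[k]) for k in baseline)

# Baseline built from one hand-nominated seed site
BASELINE = link_profile(inbound=["siteA", "siteB", "siteC"], outbound=["siteD"])

def penalized(inbound, outbound, threshold=0.5):
    """True if this site's link pattern deviates too far from the seed."""
    return deviation(link_profile(inbound, outbound), BASELINE) > threshold
```

A site whose link mix resembles the seed passes, while a reciprocal link farm (everyone it links to links back) trips the threshold.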
Sid
1) Engage my automated quality sensor to assess if each site linked to/from was a "quality" site.
2) Engage my humanoid coercion detector to assess if each link was obtained through an individual's natural choice, and reject any links that I detect may have been acquired through unnatural means, e.g. a request by email, by phone, or even by fax.
3) Engage my duplicate content filter, to weed out any page that contains three or more words strung together in an unoriginal manner.
4) Create a white list of sites that are immune to all of the BS from steps 1 to 3.
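Tongue-in-cheek as step 3 is, shingle comparison is a real duplicate-detection technique, and taking the joke literally shows why a three-word cutoff is absurd. A minimal sketch (function names and the n=3 setting are my own, not any engine's actual filter):

```python
# Illustrative sketch of step 3: flag a page as "duplicate" if it shares
# any run of three consecutive words with another page. Real filters are
# far more forgiving; at n=3 nearly every page on the web would be weeded out.

def shingles(text, n=3):
    """Return the set of n-word runs ("shingles") in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def looks_duplicated(page, other_page, n=3):
    """True if the two texts share any n-word run."""
    return bool(shingles(page, n) & shingles(other_page, n))
```

Two unrelated sentences already collide on a phrase as common as "quick brown fox", which is the point of the joke.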