
Artificial vs Organic Links

How does the algorithm detect it?

10:55 am on Jul 28, 2006 (gmt 0)

5+ Year Member



Let's say hypothetically I called up every merchant in my industry with a website and somehow persuaded them to link to me.

Now, these would all be highly relevant one-way inbound links except I didn't quite get them 'naturally'. I had to convince them to link to me.

How would the algorithm differentiate it from a set of links which grew organically?

1:31 pm on Jul 28, 2006 (gmt 0)

WebmasterWorld Senior Member quadrille is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Context is a lot of it, probably.

Some suggest that timing matters; a whole bunch of new links every Tuesday is rarely organic.

The anchor text - and the relevance of that to the page - probably matters.
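The timing signal mentioned above (a burst of new links every Tuesday) could, purely for the sake of illustration, be checked with something as simple as this. Nothing here resembles what any engine actually runs; the function name, the `factor` threshold, and the day-string inputs are all made up:

```python
from collections import Counter
from statistics import median

def bursty_days(link_dates, factor=5):
    """Flag acquisition dates where the new-link count is far above the
    median daily count -- a crude stand-in for the 'whole bunch of new
    links every Tuesday' pattern. Purely illustrative."""
    counts = Counter(link_dates)          # new links seen per day
    typical = median(counts.values())     # a robust baseline rate
    return sorted(day for day, n in counts.items() if n > factor * typical)
```

Feed it one entry per newly discovered link, e.g. `bursty_days(["mon", "tue", "tue", ...])`, and any day whose count dwarfs the median comes back flagged.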

1:41 pm on Jul 28, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



How would the algorithm differentiate it from a set of links which grew organically?

It couldn't!

If I were designing an algo to try to sense where sites are abusing links, I'd nominate a few trusted sites for a range of related terms and map their links, in and out. I'd then have the algo compare other sites against that pattern, and if a site deviates significantly from it I'd apply a "penalty".
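The seed-site comparison described above could be sketched roughly as follows. Everything here is an assumption for illustration: the two-number "link profile" (inbound count, outbound count), the Manhattan-distance comparison, and the `threshold` are all invented, and a real system would use far richer features:

```python
# Crude sketch of comparing sites against a baseline built from a few
# hand-nominated "trusted" seed sites. All feature choices and thresholds
# are illustrative assumptions, not anything a search engine is known to use.
from statistics import mean

def link_features(site, link_graph):
    """Minimal per-site link profile: (inbound count, outbound count)."""
    inbound = sum(1 for src, dst in link_graph if dst == site)
    outbound = sum(1 for src, dst in link_graph if src == site)
    return (inbound, outbound)

def deviation(features, baseline):
    """Manhattan distance between a site's profile and the seed baseline."""
    return sum(abs(f - b) for f, b in zip(features, baseline))

def flag_outliers(seed_sites, all_sites, link_graph, threshold=10):
    """Return non-seed sites whose link profile deviates from the
    average seed profile by more than the threshold."""
    baseline = tuple(
        mean(link_features(s, link_graph)[i] for s in seed_sites)
        for i in range(2)
    )
    return [
        site for site in all_sites
        if site not in seed_sites
        and deviation(link_features(site, link_graph), baseline) > threshold
    ]
```

The "penalty" step would then be whatever the ranking side does with the flagged list; the sketch only covers the detection half, and as the replies below point out, real engines have far more data to build the baseline from.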

Sid

3:56 pm on Jul 29, 2006 (gmt 0)

10+ Year Member



And hence you are not a search engine...sorry.
3:58 pm on Jul 29, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



lmao!
5:17 pm on Jul 29, 2006 (gmt 0)

5+ Year Member



If I were Google I would simply enable my organic_link_detector() algorithm. It would work broadly as follows:

1) Engage my automated quality sensor to assess if each site linked to/from was a "quality" site.

2) Engage my humanoid coercion detector to assess if each link was obtained through an individual's natural choice, and reject any links that I detect may have been acquired through unnatural means, e.g. a request by email, by phone, or even by fax.

3) Engage my duplicate content filter, to weed out any page that contains three or more words strung together in an unoriginal manner.

4) Create a white list of sites that are immune to all of the BS from steps 1 to 3.
