
Artificial vs Organic Links

How does the algorithm detect it?

10:55 am on Jul 28, 2006 (gmt 0)

Junior Member

10+ Year Member

joined:Dec 7, 2005
posts:160
votes: 2


Let's say hypothetically I called up every merchant in my industry with a website and somehow persuaded them to link to me.

Now, these would all be highly relevant one-way inbound links, except that I didn't quite get them 'naturally': I had to convince the merchants to link to me.

How would the algorithm differentiate it from a set of links which grew organically?

1:31 pm on July 28, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member quadrille is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Feb 22, 2002
posts:3455
votes: 0


Context is a lot of it, probably.

Some suggest that timing matters; a whole bunch of new links every Tuesday is rarely organic.

The anchor text - and the relevance of that to the page - probably matters.
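The timing signal mentioned above can be sketched in a few lines. This is purely illustrative — the weekday check and the 50% threshold are invented, not anything a search engine has confirmed using:

```python
from collections import Counter
from datetime import date

def looks_bursty(link_dates, threshold=0.5):
    """Flag a link profile where too many new links land on one weekday.

    link_dates: dates on which new inbound links were first seen.
    threshold: hypothetical cutoff -- if more than this fraction of
    links arrives on a single weekday, the growth is rarely organic.
    """
    if not link_dates:
        return False
    by_weekday = Counter(d.weekday() for d in link_dates)
    return max(by_weekday.values()) / len(link_dates) > threshold

# A campaign that adds links every Tuesday (all four dates are Tuesdays):
campaign = [date(2006, 7, d) for d in (4, 11, 18, 25)]
# Organic links scattered across the week:
organic = [date(2006, 7, d) for d in (3, 7, 12, 16, 21, 26)]
```

Here `looks_bursty(campaign)` fires while `looks_bursty(organic)` does not; a real system would presumably look at many such distributions (by day, by source, by anchor) rather than one.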

1:41 pm on July 28, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Aug 31, 2001
posts:1357
votes: 0


How would the algorithm differentiate it from a set of links which grew organically?

It couldn't!

If I were designing an algo to try to sense where sites are abusing links, here's what I would do: nominate a few trusted sites for a range of related terms and map their links, in and out. I would then have the algo compare other sites against that pattern. If a site deviates significantly from the pattern, I'd apply a "penalty".

Sid
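A rough sketch of the idea above — profile a few hand-picked seed sites, then penalize sites whose link pattern sits too far from that baseline. The two features and the cutoff are invented for illustration; a real implementation would use far richer signals:

```python
from statistics import mean

def link_profile(inbound, outbound):
    """Reduce a site's in/out links to a few comparable numbers (features are illustrative)."""
    reciprocal = len(set(inbound) & set(outbound))
    total = len(inbound) + len(outbound) or 1
    return {
        "in_out_ratio": len(inbound) / max(len(outbound), 1),
        "reciprocal_share": reciprocal / total,
    }

def deviation(profile, baseline):
    """Mean absolute difference from the seed-site baseline, per feature."""
    return mean(abs(profile[k] - baseline[k]) for k in baseline)

def penalty(profile, baseline, cutoff=1.0):
    """Apply a 'penalty' only when a site deviates significantly (cutoff is hypothetical)."""
    return deviation(profile, baseline) > cutoff

# Baseline built from hand-picked seed sites for a set of related terms:
seeds = [
    link_profile(["a", "b", "c"], ["b", "d"]),
    link_profile(["a", "c", "e", "f"], ["c", "g"]),
]
baseline = {k: mean(p[k] for p in seeds) for k in seeds[0]}
```

A site with fifty inbound links and one outbound link would blow past the baseline's in/out ratio and get flagged, while a profile resembling the seeds would pass.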

3:56 pm on July 29, 2006 (gmt 0)

Preferred Member

10+ Year Member

joined:Jan 4, 2005
posts:621
votes: 0


And hence you are not a search engine...sorry.
3:58 pm on July 29, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:June 19, 2002
posts:1945
votes: 0


lmao!
5:17 pm on July 29, 2006 (gmt 0)

Junior Member

5+ Year Member

joined:Mar 23, 2006
posts:88
votes: 0


If I were Google I would simply enable my organic_link_detector() algorithm. It would work broadly as follows:

1) Engage my automated quality sensor to assess if each site linked to/from was a "quality" site.

2) Engage my humanoid coercion detector to assess if each link was obtained through an individual's natural choice, and reject any links that I detect may have been acquired through unnatural means, e.g. a request by email, by phone, or even by fax.

3) Engage my duplicate content filter, to weed out any page that contains three or more words strung together in an unoriginal manner.

4) Create a white list of sites that are immune to all of the BS from steps 1 to 3.

 
