Forum Moderators: Robert Charlton & goodroi
[I don't know this from personal experience; I just strongly suspect it to be the case from reading others' observations here.]
If you mean something different, your analogy is not very precise.
Assuming all the links were highly relevant and one-way inbound, would the algorithm detect this as artificial?
Why is it artificial?
Surely organic links occur when a site owner becomes aware of a page, decides it's worth linking to, and adds a link to it from one of their own pages. For them to become aware of it, someone or something has to tell them about it: a directory, a search engine, another page or an email will already contain a link to the page. They don't just type a random URL into their browser.
Would it be artificial if you sent a press release to a magazine in your industry which published an article about your site, leading a number of people to decide it was useful and add a link to it? No, of course it wouldn't.
IMHO Google has completely the wrong approach to spam in attacking it directly. They should find ways to make it obsolete.
Sid
How does the algorithm detect it?
No-one here can answer that. You'd have to try and bribe a Google engineer... ;)
All we can really do is speculate on what we would do if we were Google. Here are the kinds of things I think I would be looking at. If I had the data that they have, I suspect I could do this pretty accurately. Some of these we can be fairly sure Google is already doing:
1. Allow links to age. Don't count them until they're X months old.
2. Place more weight on the authority status of linking pages. Lots of incoming links from non-authority pages? Ignore them.
3. If #2 is in high volume (links > X in time frame < Y) then raise a flag.
4. Give new pages X amount of boost in the SERPS for Y period of time to allow people to find them and natural organic links to germinate if they're good pages. Then remove the boost and see what materialises. X and Y would be variables, possibly based on domain authority.
5. Referencing #1, if links never reach X months old, then raise a flag on both linking and linked-to pages.
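Just to make the idea concrete, here's a rough sketch of how rules 1-5 might hang together. Every threshold, field name and the Link structure itself are my own invention for illustration - nobody outside Google knows the real values or signals:

```python
from dataclasses import dataclass

# Hypothetical thresholds - the X and Y placeholders from rules 1-5 above.
MIN_LINK_AGE_MONTHS = 3      # rule 1: don't count links younger than this
BURST_LINK_COUNT = 500       # rule 3: more links than this...
BURST_WINDOW_MONTHS = 1      # ...arriving within this window raises a flag

@dataclass
class Link:
    age_months: int            # how long the link has existed
    source_is_authority: bool  # rule 2: does the linking page have authority status?

def score_links(links, flags):
    """Count only links that pass rules 1-2; raise flags per rules 3 and 5."""
    recent = [l for l in links if l.age_months < BURST_WINDOW_MONTHS]
    if len(recent) > BURST_LINK_COUNT:
        flags.append("burst")            # rule 3: suspicious acquisition rate
    counted = 0
    for l in links:
        if l.age_months < MIN_LINK_AGE_MONTHS:
            continue                     # rule 1: too young to count yet
        if not l.source_is_authority:
            continue                     # rule 2: ignore non-authority sources
        counted += 1
    # Rule 5 (rough proxy): every current link is still young, i.e. links
    # keep getting churned before they ever mature.
    if links and all(l.age_months < MIN_LINK_AGE_MONTHS for l in links):
        flags.append("links-never-age")
    return counted

# Example: two aged authority links count; one young non-authority link doesn't.
flags = []
print(score_links([Link(6, True), Link(4, True), Link(1, False)], flags))  # -> 2
```

The point isn't the specific numbers - it's that each rule is cheap to evaluate per page once you already hold the link graph and its history.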
OK, that's the part based on my guesswork as to what might help separate the wheat from the chaff. But since we have the largest PC cluster on the planet coupled to the most extensive set of search data on the planet, we may as well use it to our advantage, right?
6. Pattern Recognition. I'd probably hand this one over to a bunch of PhD employees, but essentially I could take multiple sets of hand-picked sites that we like, and multiple sets of hand-picked sites that we don't like, and analyse the patterns they create, partly based on 1-5 above (growth rate, link structure off-site and on-site, etc.). What we're looking for is a set of data that represents the "average" set of events for a site from being newborn to being something that we want in our SERPs. Conversely, we want an "average" set of events for spam sites from newborn upwards.
Once we have that pattern, we can keep an eye out for it and when we have a "quality site events" match we can stamp it with our "authority boost" scoring system.
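In its simplest possible form, that "average set of events" is just a centroid per class, and the match is whichever pattern a new site sits closer to. The features and the hand-picked training numbers below are all made up - a real version would use far richer event data and something better than Euclidean distance:

```python
import math

# Hypothetical features summarising a site's life from newborn onward:
# (link growth rate, fraction of authority inlinks, outbound links per page).
# The "hand-picked" training sites and their values are invented here.
quality_sites = [(0.1, 0.6, 5), (0.2, 0.7, 8), (0.15, 0.5, 6)]
spam_sites = [(5.0, 0.05, 80), (8.0, 0.02, 120), (6.5, 0.1, 90)]

def centroid(vectors):
    """Average each feature across the hand-picked example sites."""
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n for i in range(len(vectors[0])))

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

QUALITY_PATTERN = centroid(quality_sites)  # the "average" quality-site events
SPAM_PATTERN = centroid(spam_sites)        # the "average" spam-site events

def earns_authority_boost(site):
    """Stamp the site with the 'authority boost' if it sits nearer the quality pattern."""
    return distance(site, QUALITY_PATTERN) < distance(site, SPAM_PATTERN)

print(earns_authority_boost((0.12, 0.55, 7)))   # slow growth, authority links -> True
print(earns_authority_boost((7.0, 0.03, 100)))  # explosive growth, junk links -> False
```

With the cluster and data described above, the interesting engineering problem is choosing the events to measure, not the matching itself.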
Hmmm.... maybe webmasters would see this in action and claim it's a "sandbox". Oh well, who cares - this is about search users not webmasters.
Raised flags would be dealt with by a tool-set - probably a bunch of applications sent out to crawl and analyse pages looking for the standard tricks of the spam and link-selling trade.
So, that's the link-scoring mechanism for my search engine. What else could you do?
TJ