|How can search engines even identify 3 way links? |
If Americans can put a man on the moon with the equivalent of a Nintendo handheld, Google and Yahoo can handle graphing a three way link exchange.
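For the curious, a three way exchange is literally just a directed 3-cycle in the link graph, and enumerating those is a few lines of code. A toy sketch (the hostnames are made up, and a real engine would run this over billions of edges, not a dict):

```python
# Hypothetical link graph: a "3 way link exchange" is a directed
# 3-cycle, siteA -> siteB -> siteC -> siteA. Hostnames are invented.
links = {
    "siteA.com": {"siteB.com"},
    "siteB.com": {"siteC.com"},
    "siteC.com": {"siteA.com"},
    "innocent.com": {"siteA.com"},
}

def three_way_cycles(graph):
    """Return every A -> B -> C -> A cycle as a sorted tuple of hosts."""
    cycles = set()
    for a, outs in graph.items():
        for b in outs:
            for c in graph.get(b, ()):
                # require the closing edge c -> a and three distinct hosts
                if a in graph.get(c, ()) and len({a, b, c}) == 3:
                    cycles.add(tuple(sorted((a, b, c))))
    return cycles

print(three_way_cycles(links))  # the A/B/C ring shows up; innocent.com doesn't
```

The distinct-hosts check keeps ordinary reciprocal links (A links B, B links A) from being flagged as a three way ring.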
There's a paper over at Yahoo Research that talks about mapping billions of edges and identifying spam link networks (I posted a link to that in the past).
In addition to digging up that paper, you may also make a point of speaking with the Googlers at next week's PubCon (you are going, right?). If you do, you will walk away with a better idea of what can and can't be done. ;)
For those who think they might not get anything out of pubcon, think again. There's a lot to be learned over there. And yes, 3 way linking is cheating.
See you at PubCon!
[edited by: martinibuster at 6:32 pm (utc) on April 11, 2006]
Here's the abstract, and a link to where you can obtain the paper:
|We present a new algorithm for finding large, dense subgraphs in massive graphs... and is extremely efficient, capable of handling graphs with tens of billions of edges on a single machine with modest resources. We apply our algorithm to characterize the large, dense subgraphs of a graph showing connections between hosts on the World Wide Web; this graph contains over 50M hosts and 11B edges, gathered from 2.1B web pages. We measure the distribution of these dense subgraphs and their evolution over time. We show that more than half of these hosts participate in some dense subgraph found by the analysis... Upon examination, many of the dense subgraphs output by our algorithm are link spam, i.e., websites that attempt to manipulate search engine rankings through aggressive interlinking to simulate popular content. We therefore propose dense subgraph extraction as a useful primitive for spam detection, and discuss its incorporation into the workflow of web search engines. |
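The paper's own algorithm is more involved (it's built to fit 11B edges on one machine), but the core primitive - pull out the densest chunk of the graph - has a classic greedy sketch, Charikar's peel-off approximation. A toy version, with made-up hosts, just to show why a tight link-exchange clique is the thing that survives:

```python
from collections import defaultdict

def densest_subgraph(edges):
    """Charikar's greedy 2-approximation: repeatedly peel the
    lowest-degree host; return the densest intermediate subgraph."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    nodes = set(adj)
    m = sum(len(s) for s in adj.values()) // 2
    best, best_density = set(nodes), m / len(nodes)
    while len(nodes) > 1:
        u = min(nodes, key=lambda n: len(adj[n]))  # peel lowest-degree host
        for v in adj[u]:
            adj[v].discard(u)
        m -= len(adj[u])
        nodes.discard(u)
        density = m / len(nodes)
        if density > best_density:
            best, best_density = set(nodes), density
    return best

# Made-up toy: four hosts aggressively interlinked, plus a sparse tail.
edges = [("a", "b"), ("a", "c"), ("a", "d"), ("b", "c"), ("b", "d"),
         ("c", "d"), ("d", "e"), ("e", "f")]
print(densest_subgraph(edges))  # the interlinked clique outlasts the peeling
```

This is not the Yahoo paper's exact method, just the flavor: sparse, natural linkage gets peeled away early, while "aggressive interlinking to simulate popular content" is exactly what maximizes density.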
And for Google, this PDF! The Second Eigenvalue of the Google Matrix [citeseer.csail.mit.edu]
|Spam Detection. ...In particular, each pair of leaf nodes in the SCC graph for the chain P corresponds to an eigenvector of A with eigenvalue c. |
These leaf nodes in the SCC are those subgraphs in the web link graph which may have incoming edges, but have no edges to other components. Link spammers often generate such structures in attempts to hoard rank. Analysis of the nonprincipal eigenvectors of A may lead to strategies for combating link spam.
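That "leaf component" test is also easy to picture in code. A sketch (the graph and hostnames are invented, and this finds the leaf SCCs directly rather than via eigenvectors - the quote's point is that the two views coincide):

```python
from collections import defaultdict

def sccs(graph):
    """Kosaraju's algorithm: return the strongly connected components."""
    order, seen = [], set()

    def dfs(g, start, out):
        stack = [(start, iter(g.get(start, ())))]
        seen.add(start)
        while stack:
            node, it = stack[-1]
            advanced = False
            for nxt in it:
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append((nxt, iter(g.get(nxt, ()))))
                    advanced = True
                    break
            if not advanced:
                stack.pop()
                out.append(node)  # postorder (finish time)

    nodes = set(graph) | {v for vs in graph.values() for v in vs}
    for n in nodes:
        if n not in seen:
            dfs(graph, n, order)
    rev = defaultdict(list)
    for u, vs in graph.items():
        for v in vs:
            rev[v].append(u)
    seen.clear()
    comps = []
    for n in reversed(order):  # decreasing finish time on reversed graph
        if n not in seen:
            comp = []
            dfs(rev, n, comp)
            comps.append(set(comp))
    return comps

def rank_hoarders(graph):
    """Flag SCCs with incoming edges but no outgoing edges - the
    rank-hoarding shape. Trivial single-page components are skipped."""
    comps = sccs(graph)
    comp_of = {n: i for i, c in enumerate(comps) for n in c}
    incoming, outgoing = set(), set()
    for u, vs in graph.items():
        for v in vs:
            if comp_of[u] != comp_of[v]:
                outgoing.add(comp_of[u])
                incoming.add(comp_of[v])
    return [c for i, c in enumerate(comps)
            if i in incoming and i not in outgoing and len(c) > 1]

# Invented example: x/y/z link only among themselves and take a link in.
graph = {
    "a": ["b"], "b": ["a", "blog.com"], "blog.com": ["x"],
    "x": ["y"], "y": ["z"], "z": ["x"],
}
print(rank_hoarders(graph))  # the x/y/z ring gets flagged
```

The normal pages (a, b) link onward, so their component has outgoing edges and is left alone; the ring that only accumulates is what gets flagged.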
It's all about pattern detection. Given enough data, the patterns discovered provide reliable means of detecting natural and unnatural behaviors.
I didn't even want to comment on this. But I know for a fact that, if done right, small, independent, properly simulated networks work.
I don't think anyone would dispute that they work, just what the consequences of discovery might be ;-)
I guess it's a matter of personality =)
If you want to stay under the radar at all times, then you shouldn't.
I, for instance, like to experiment, and must admit I had several sites banned when I got greedy. But when I controlled myself - never had any troubles.
The better question is: how can it not be cheating?
*But when I controlled myself - never had any troubles.*
Therein lies the lesson, IMO.
The answer is six months.
Google no doubt works on budgets and business plans. If 3 way links interfere with AdWords etc., then they will raise the funds to alter the algorithm to seek out sites that are basically 3 way link farms.
Personally I don't think they have done this yet, but are relying on scare tactics.
Also, most 3 way links are done on links pages, which count for squat in competitive SEO - so why bother anyway?
If there are more than 6 or 7 links on the page, personally I wouldn't bother.