wheel - 3:34 pm on Nov 27, 2011 (gmt 0)
I'm putting together a presentation right now that's caused me to clarify some of my thinking on link building. One of the things I'm trying to pound into their heads is the idea of risk assessment. Not coming out and saying go one way or the other, but simply being able to assess the risk so a decision can be made. Here's the idea:
- you must measure three things about the backlinks: authority, relevance, and footprint.
- I measure authority by backlinks - if a site has links from authority sites, then it is an authority.
- I measure relevance from content and backlinks.
Links from sites that are neither authoritative nor relevant increase risk.
- footprints. If something is mechanized, then it is likely to leave a footprint. Just like everyone is trying to reverse engineer Google, Google is reverse engineering your methodology. And Google's smarter than you. They're also better looking. And they have more money.
Examples of footprints:
- type of site (i.e. all blogs)
- all low end sites
- look at link attributes for footprints: position on page, anchor text, in-content vs. not, and more.
- one thing that is often overlooked: network footprints. Any time a closed set of sites is used, a network footprint can be found. Even if the set of sites is large and diverse.
Any time a system is used to generate links (and a system is used almost 100% of the time), that system will leave a footprint of some kind.
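To make the network-footprint point concrete, here's a minimal sketch of how a closed set of linking sites could be spotted. The data and domain names are entirely hypothetical; the idea is just that sites fed links by the same pool of domains will show unusually high overlap in their backlink profiles:

```python
from itertools import combinations

def jaccard(a, b):
    """Overlap between two sets of linking domains (0 = disjoint, 1 = identical)."""
    return len(a & b) / len(a | b)

# Hypothetical backlink profiles: target site -> set of domains linking to it.
backlinks = {
    "site-a.example": {"blog1.example", "blog2.example", "blog3.example"},
    "site-b.example": {"blog1.example", "blog2.example", "blog4.example"},
    "site-c.example": {"news1.example", "forum1.example", "blog9.example"},
}

# Pairwise overlap: unrelated sites drawing links from the same closed
# network stand out, even when the network itself is large and diverse.
for (s1, d1), (s2, d2) in combinations(backlinks.items(), 2):
    print(s1, s2, round(jaccard(d1, d2), 2))
```

In this toy data, site-a and site-b overlap heavily (shared blog network) while site-c does not; scale the same comparison up and a closed network of link sources becomes visible no matter how varied the individual sites look.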
Let's say you crawl the web looking for sites that match specific criteria. You get a weekly list of potential sites and contact them for links. All great, relevant authority sites. No footprint, right?
Oh yes there is - every site you get a backlink from will meet the original criteria the crawler used to select sites to contact. That's a footprint. And it's probably a footprint a mile wide, too.
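The same point in code: any selection filter stamps its criteria onto everything it selects. The criteria, metrics, and domains below are made up for illustration; the takeaway is that the shared profile of the selected sites *is* the footprint:

```python
# Hypothetical weekly crawler output and the filter used to pick contacts.
candidates = [
    {"domain": "a.example", "pagerank": 5, "topic": "finance", "has_blogroll": True},
    {"domain": "b.example", "pagerank": 2, "topic": "finance", "has_blogroll": True},
    {"domain": "c.example", "pagerank": 6, "topic": "finance", "has_blogroll": True},
    {"domain": "d.example", "pagerank": 7, "topic": "sports", "has_blogroll": False},
]

def meets_criteria(site):
    """The crawler's selection rule - whatever it is, it becomes the footprint."""
    return site["pagerank"] >= 4 and site["topic"] == "finance" and site["has_blogroll"]

selected = [s for s in candidates if meets_criteria(s)]

# By construction, every site you end up with a link from shares the
# filter's attributes - a uniform profile that didn't arise naturally.
assert all(meets_criteria(s) for s in selected)
```

Natural link profiles are messy; a profile where every linking site clears the same thresholds on the same attributes is the mile-wide footprint described above.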
Assessing the risk then comes into play based on whether you think Google measures this now, as well as whether you think Google will measure it in the future.