|Link Bait - Valid Strategy or Social Engineering Scheme?|
Is Link Baiting a Search Engine Approved Technique?
Technically, a vote for a site is a vote for its content. But when someone creates link bait that attracts links to the link bait page, does that link represent a vote for the content of the site, or is it a vote only for the bait page itself, regardless of what else is hosted on the site?
Is link bait simply an exercise in social engineering for the purpose of getting others to link to a site? Is that really what the search engines had in mind when they began counting votes as a measure of how authoritative site content is? At what point does a site stop being authoritative and become merely well linked because it has an aptitude for churning out link bait? Should a search engine regard a site as authoritative because it offers great top ten lists?
The more I think about it, the more I'm convinced that link baiting links do not represent a vote for the content of a site. I'm leaning toward believing that link baiting links shouldn't count toward a site's link popularity score. Link baiting isn't about the quality of a site's widgets for sale. Link baiting isn't about their customer service. Link baiting isn't about how deep the information about widgets is at the site. Link baiting is merely a method for amassing links regardless of the quality of the content. But for the search engines to accurately determine relevance, shouldn't links to a site be all about the quality of the site content? If assigning links is about the quality of the content and its relevance to a query, then I don't see why a search engine would regard links from a link bait technique differently from links obtained from an artificial link scheme such as a site counter.
Never mind what I think. I want to hear what you think. Should link bait links count as votes for the content and the quality of the site? Or are link bait links only links to the bait, not to be construed as votes for the quality of the site?
I can see your point if the site is about selling red widgets and the link bait is about repeening gazornaflats.
However, what if the link bait is at least peripherally relevant, such as testing your knowledge about lowering the cost of widgets or identifying vintage shades of red?
|...what if the link bait is at least peripherally relevant, such as testing your knowledge about lowering the cost of widgets or identifying vintage shades of red? |
Let's assume a particular site has rubbish for content but features great link bait. Is it in the best interest of the search engines to rank the low quality content because the site features peripherally relevant link bait? Or is it in the best interest of the search engines to rank sites that attained links because of the high quality of their content?
|Is it in the best interest of the search engines to rank the low quality content because the site features peripherally relevant link bait? |
No, it isn't, but I think the issue is that Google still has lots of work to do in the assessment of what type of content is considered 'quality' and what type is considered 'lousy'.
The difficult part is that we are asking these questions within the context of an engine that DOES place trust and authority on pages that are well linked to from trusted resources, regardless, in my opinion, of what that content actually says or what value it truly holds from a user perspective.
I think the day will come when Google will be able to actually make this distinction. As ranking becomes more and more 'micro' (it wasn't so long ago that x great links pointed at the homepage of a website would rank EVERY category on that website top 10 by default), I would assume this is an area Google will heavily examine and move into.
Currently, I think it might be well beyond the processing power they have, and possibly even beyond the data they have collected, to be able to make those distinctions on a per-website basis.
Matt Cutts wrote a post in 2006 [mattcutts.com] about link bait, expressing approval of good content, like an article, as a form of good link bait.
|...content can be both white-hat and yet still be wonderful "bait" for links... And generating information or ideas that people talk about is a surefire way to generate links. |
But isn't that restating that good content is what search engines are interested in? Matt said nothing about fluff like top ten lists or humorous quizzes. Zero. Read the post for yourself.
Creating good content is creating good content. For marketing purposes you can call it link baiting, but at bottom it's just good content. The more good content you have the more links you will get, especially if you tell others about it. That's a no brainer. There is nothing new or game changing about that.
But Link Bait as a link strategy goes deeper. Link baiting has been accepted by our industry with only superficial consideration. It's time to take a serious look at it because it's our duty to investigate whether something is useful or not.
Search engines use the web graph to determine what content is relevant. This is why links are referred to as "votes": each link is a vote for the content it is linking to. But you cannot use that explanation to justify viral links. It is naive to think that every link should and does count. The reality is that not every link is equal: some links count very little and some links don't count at all. Does Google dampen viral links? Yes, there is no question about that. It's a known fact, for example, that Google dampens the viral links from Google Bombing [google.com], which contradicts the assertion that Google does not devalue viral links.
A link does not necessarily equal a vote for the content it is linking to, and links do not always indicate authority. This is why Google analyzes the context of a link: to find out why one site links to another, and to return relevant SERPs. It is untrue to assert that dampening viral links will negatively affect the link graph; it would be a slippery slope if Google did not dampen them.
Google bombing is a viral linking event in which Google takes steps to dampen and devalue [searchengineland.com] the effect of the links. Google has been examining the context and relevance of links, then dampening the PageRank passed, since at least 2003. We know for certain that Google devalues irrelevant links and dampens the value of links in the footer, so it is a stretch to assert that Google does not devalue viral links. It's clear that Google analyzes, then dampens and devalues, links in general. Why would viral links be exempt from the dampening and devaluation process that all other links are subject to?
It's a common assumption that Google cannot devalue link bait links because doing so would harm the link graph. That is either ignorance or an untruth. If Google can devalue sitewide links, links from irrelevant link partnerships, links in a footer, and Google bomb links, then it becomes easier to see that Google can devalue link bait links to fluff content like top ten lists, funny photos, and quizzes.
Good content that receives relevant links from other sites gets rewarded. This is the classic link bait that Matt discusses in his blog post. But not all link bait consists of good content, and as a consequence not all links represent a vote for quality. This is why you will often see references to the number of links gained by a particular link bait campaign, but rarely, if ever, any mention of an improvement in rankings.
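The idea that links carry unequal weight can be sketched as a toy weighted PageRank. Everything here (the page names, the per-link weights, the damping factor) is invented for illustration; it is not Google's actual algorithm:

```python
def weighted_pagerank(edges, damping=0.85, iterations=50):
    """edges: list of (source, target, weight) tuples, weight in [0, 1].
    A dampened link simply passes less rank; the discounted portion
    is discarded rather than redistributed."""
    pages = {p for s, t, _ in edges for p in (s, t)}
    rank = {p: 1.0 / len(pages) for p in pages}
    n_out = {p: 0 for p in pages}          # outlink count per page
    for s, _, _ in edges:
        n_out[s] += 1
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for s, t, w in edges:
            # Each outlink gets an equal share of the source's rank,
            # scaled down by the link's trust/relevance weight.
            new[t] += damping * rank[s] * w / n_out[s]
        rank = new
    return rank

# Hypothetical weights: an editorial link passes full value, a viral
# link is dampened, a footer-farm link is devalued entirely.
edges = [
    ("review_blog", "widget_site", 1.0),   # relevant editorial link
    ("meme_page", "widget_site", 0.2),     # viral link, dampened
    ("footer_farm", "widget_site", 0.0),   # devalued to nothing
    ("widget_site", "review_blog", 1.0),
]
ranks = weighted_pagerank(edges)
```

Under these made-up weights, widget_site still ends up highest on the strength of its one strong editorial link, while the dampened and devalued links contribute little or nothing.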
Comes back to
|Google still has lots of work to do in the assessment of what type of content is considered 'quality' and what type is considered 'lousy' |
If spam links can help you rank (and they do) then even lousy link bait links can help you rank.
I guess it all depends on who is baited by your link:
1) Poor Quality Approval / Less Relevant Approval
e.g. social profiles (that could be fake) or tuppenny-ha'penny blogs (that could be fake) or forum threads (that could be link drops)
2) High Quality Approval / More Relevant Approval
e.g. authority/hub sites that link to your bait from within their content or from their own linkbait (i.e. a list of top SEO tools for example) or a respected blogger/columnist who is so impressed by your linkbait that they write an article about it
|Let's assume a particular site has rubbish for content but features great link bait. |
I see two things that come into play here:
- the link graph
- quality and relevancy of linked content
If all the inbound links go to one URL, and especially if that URL is not the home page of the site, then that says something about the quality of the remaining pages of the site.
I say throttle the internal rank passing based on quality and relevance.
Maybe treat the link bait page as a subdomain that is linking out to a junk subdomain.
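That "throttle the internal rank passing" idea could look something like this crude sketch; the relevance scores, threshold, and multiplier are invented for illustration, not anything a search engine has published:

```python
def internal_rank_passed(bait_rank, relevance, throttle_threshold=0.5):
    """Scale the rank a link bait page passes to the rest of its site
    by how relevant the bait is to the site's main topic (0.0-1.0).
    Below the threshold, treat the bait almost like a separate
    subdomain linking out to unrelated content."""
    if relevance < throttle_threshold:
        # Heavy throttle: pass only a sliver of the bait's equity.
        return bait_rank * relevance * 0.1
    return bait_rank * relevance

# A quiz page barely related to the site's widgets passes little rank;
# a closely relevant how-to article passes most of its rank.
quiz_flow = internal_rank_passed(bait_rank=10.0, relevance=0.2)
howto_flow = internal_rank_passed(bait_rank=10.0, relevance=0.9)
```

The design choice is the discontinuity at the threshold: a peripherally relevant bait page isn't merely discounted in proportion, it is cut off almost entirely, mirroring the "junk subdomain" treatment suggested above.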
The real issue here to me, martinibuster, is that Google still cannot accurately assess and establish relevancy when it comes to links. Nor have they fixed Google bombing.
|The truth is that we know for certain that Google devalues irrelevant links |
Perhaps sometimes, but not accurately or consistently enough that I would back up that statement.
The whole crux of paid link reporting is that others can 'help' Google assess websites when they are pushing those borders and the Google algo cannot detect it. Which is, in my opinion, about 75% of the time.
In most of the genres I watch, I see at least 1-2 websites that have thrown unrelated paid blog posts and hundreds, if not more, unrelated blogroll links at their problem. If you subtract those links from their backlink profile, they have 1-5 good links, maybe a Yahoo directory listing. I can give you lots of examples of websites that have seen 'bursts' of irrelevant inbound links and have benefited from this. Paid, unbaited, viral, or otherwise.
Other websites in these sectors that are high quality and have been around longer have lost positions to these websites, even though they have much more authority, prospective link bait, and better, older, more relevant links (by a long shot).
Although I would love to 'believe' that relevancy is a serious factor, on the street it just isn't adding up.
Not a rant, and I am certainly not suggesting that Google 'shouldn't' move to a system where it can better detect viral links, or segment ranking score by document and not apply haphazardly across entire domains, but I simply don't have faith right now in G's ability to ascertain the meaning.
|...Link bait links should count as votes for the content and the quality of the site? Or link bait links are only links to the bait and should not be construed as votes for the quality of the site? |
To pick up on the "should" vs "does" aspects of this discussion... Google is big on intention and I'm sure would love to use intention where it can. I think its ability to do so here is iffy at best. For a given search, Google would not only need to be able to discern the intention of the links, but also the intention of the searcher and the intention of the page/site content. This is far beyond what Google appears to be capable of doing.
That said, I wouldn't completely rule out Google differentiating among various linking patterns. In determining whether the links are votes for "quality," e.g., the relationship of temporal and (query-independent) trust factors may well play a part. A flurry of blog links for one article might be classified very differently than, say, a steady accumulation of links from different kinds of sources at varying intervals. Link bait links may well have discernible patterns which Google can spot, and this might enable Google to weight them differently from other links. Obviously, all sorts of other factors apply. These might be factors that Google can track now, and/or will soon be able to track or compute more easily with the Caffeine file system in place.
There are also different kinds of queries which Google is tracking now. Queries deserving freshness, e.g., might be getting more boost from viral links than queries not requiring freshness.
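The flurry-vs-steady distinction could, in principle, be detected from link timestamps alone. A crude sketch with arbitrary thresholds (nothing here comes from Google):

```python
from statistics import mean, pstdev

def classify_link_pattern(link_days, burst_ratio=1.5):
    """link_days: sorted day numbers on which inbound links arrived.
    A flurry concentrated in a few days looks 'viral'; links spread
    evenly over time look 'steady'."""
    gaps = [b - a for a, b in zip(link_days, link_days[1:])]
    if not gaps or mean(gaps) == 0:
        return "viral"  # everything landed at once
    # High gap variability relative to the mean gap suggests bursts.
    cv = pstdev(gaps) / mean(gaps)  # coefficient of variation
    return "viral" if cv > burst_ratio else "steady"

flurry = [100, 100, 100, 101, 101, 400]    # one spike, then silence
steady = [10, 40, 75, 100, 130, 165, 200]  # regular accumulation
```

The coefficient of variation of the gaps between links is one cheap burstiness measure; a real classifier would presumably fold in source diversity and trust as well.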
Obviously Matt isn't going to be divulging secret sauce, but something he comes back to repeatedly is, I feel, central to the secret sauce... or at least to Google's goals regarding the recipe... that, in the long run, sites which work best for users tend to work best for Google.
So, without making distinctions about how Google classifies links, Matt lays out a scenario which suggests that sites based on interesting or useful information are going to win out, simply because they will get more (and perhaps better) link votes over time....
|...Personally, I’d lean toward producing interesting data or having a creative idea rather than spouting really controversial ideas 100% of the time. If everything you ever say is controversial, it can be entertaining, but it’s harder to maintain credibility over the long haul. |
Only in a webmaster forum would strong marketing be called 'social engineering'. :)
Compare these two marketing methods:
1) website has some cool content. Webmasters link to that site, people come and buy stuff.
2) Tiger/Shaq/et al get paid a bajillion dollars to wear Nike. Everyone buys Nike products.
p.s. nobody buys Nike because of quality. Perhaps they're good quality, perhaps they're not. Either way, it's irrelevant to what consumers want.
A link bait scheme is a valid marketing scheme, and the resulting links are as much a citation as any other.
There's no way we're going to get into shades of citations, like 'I like your site because of A' is better than 'I like your site because of B'. The SEs can't even figure out 'I like your site because of C', where C is cash.
Link bait need not be the notorious kind we have come to know. Even an original article, or breaking a news story in my niche, can be link bait that gets everyone in my industry to link to me.
There is then only a thin line between link bait and breaking news.
What constitutes "link bait" is in the eye of the beholder, isn't it? How is a search engine supposed to judge the site owner's motives?
When my laser printer broke down, I found a site with a great "how-to" article about my printer's specific problem and how to fix it. By happy coincidence, the site sold a parts kit (with instructional video) for that DIY repair. One could argue that the how-to article was merely link bait (I'm sure it got plenty of links, and it ranked high in Google), but one could also argue that kit sales were like advertising on an editorial site: i.e., a way to support the Web publisher's "how-to" writing. In the end, it didn't matter: I got my problem solved, the site owner got a sale, and Google's search results provided a good "user experience."
In some countries, $50 qualifies as 'link bait' :).
Link bait maybe, but what's Google to do? There are so many link schemes that Google has to give up on all but the most obvious ones.
If you have 500 good links but 5 or 50 suspicious ones, what should it do, ban your domain completely? If I put my domain name in my profile or signature, is that a vote for my content?
Or, if cnn buys a site and adds a sitewide link to it, what's that?
|Link baiting is merely a method for amassing links regardless of the quality of the content. |
Interesting observations, martinibuster. There are so many definitions of "link bait". At the end of the day it's all about relevance, relevance, relevance. If I link to a site that has good link bait but I don't realize it's link bait, I might simply see it as useful content for my end user. A link is still a link.
This is similar to deep linking, where I ask the user to link to a specific internal page on my website (instead of my home page) when they request a link from a specific category on my links page. I have new sites that rank very well from the deep links. The end user making the link may very well see my internal page, related to his own genre, as "link bait", but I see it more as a traditional deep link.
So one's own definition of link bait is going to directly affect how one thinks it should "count as votes for the content and the quality of the site".
Not meaning any disrespect, I try not to overthink these things. Good content attracts good links. Good deep linking opportunities attract great link partners. Relevance is the name of the game.
Just got off the phone with a customer who has a whale watching business, and I was explaining to him how important it is to create some good content on his site to get decent links. I told him this is sometimes called "link bait"; he laughed because he thought I was making a pun on fishing. We talked about a free web cam on his site (for whale watching) because it would attract relevant quality links from local area businesses targeting his same market segment. You and I might call that link bait, but really it's just good ol' fashioned free quality content.
At the end of the day, I don't care what the search engines think. I publish as much decent content as I can to attract the links and I reciprocate my links (when relevant) anytime the other party asks for the link back.
I have the feeling that the value of a link is tied to the value of the content page from which it emanates.
What's good content today? Same as it always was. Content that provides answers and insights and stimulates discussion.
From a measurement perspective, for an indexer of content like Google, that would these days have at least a bit to do with how long the user stayed on the page and, more importantly, A) whether or not the user returned to the site and B) whether or not the user went deeper into the site.
Gauging the value of a content page as the giver of an outgoing link or vote should A) have less to do with how many incoming links it has itself, since links are a cheap commodity, and B) have more to do with the measured type and level of interaction between the user and the content page.
Having said that, content pages, whether heavily linked to or not, that fulfill the definition of good content, objectively measured with some type of quality score, should be considered by an indexer as a good link source for a referred page.
The basic idea is that links are not required for a content page to be well-regarded by Google, yet links from pages that are well-regarded can provide great juice to referred pages.
I think the implication is that it becomes increasingly important to keep writing good original content and to place it on your own site, both to build site identity (i.e. the relevance of the site to its topic matter) and to build the value of the site's internal links to its own pages. Most interior pages of a site will have relatively few incoming links from external sites, yet these same pages can heavily reinforce other pages on the site, more so if those pages satisfy a quality threshold in an index: they gained a visitor and provided content good enough to spur deeper travels into the site and/or repeat visits.
For the myriad long-tail terms, I think this would be less of an issue, but for a site's bread-and-butter keywords and phrases, those in high competition, I would think these considerations would tip the scale.
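A back-of-the-envelope version of the engagement-based quality score described in this post might look like the sketch below; the signal names, weights, and caps are all invented for illustration:

```python
def quality_score(avg_seconds_on_page, return_rate, went_deeper_rate):
    """Combine engagement signals into a 0-1 score.
    return_rate and went_deeper_rate are the fractions of visitors
    who came back to the site or clicked through to deeper pages."""
    time_signal = min(avg_seconds_on_page / 120.0, 1.0)  # cap at 2 min
    # Weight repeat visits and deeper travel above raw dwell time,
    # per the post's ordering of importance.
    return 0.2 * time_signal + 0.4 * return_rate + 0.4 * went_deeper_rate

good_page = quality_score(avg_seconds_on_page=150, return_rate=0.5,
                          went_deeper_rate=0.6)   # engaging page
thin_page = quality_score(avg_seconds_on_page=15, return_rate=0.02,
                          went_deeper_rate=0.05)  # bounce-heavy page
```

A page scoring well on such a measure would then, in the scheme suggested above, be a more trustworthy giver of outgoing link votes regardless of how many inbound links it has itself.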