| 9:00 pm on Dec 10, 2009 (gmt 0)|
Marissa Mayer was the first to indicate they were deprecating links, round about 2003. This was in a session at SES San Jose. I followed up with a question about this and she made it clear, way back in 2003, that Google was deprecating links based on the context of the page.
It's easier to digest if you think about the AdSense technology and how they can match ads to content on the fly, as well as calculate estimated clicks etc.
Does Google deprecate links? I believe they do, based on my experience.
Here's the discussion from 2003 [webmasterworld.com] where I reported what I had heard.
[edited by: martinibuster at 9:08 pm (utc) on Dec. 10, 2009]
| 9:06 pm on Dec 10, 2009 (gmt 0)|
The main reason to place a link on a relevant site is to attract an interested visitor.
If you consistently aim at that, the overall signals of relevance that develop within your link profile will reinforce your own theme with pretty good clarity.
Not every link needs to be perfectly, obviously relevant, as long as your overall link profile is sending enough signals that help the spiders decide what your page is about.
| 11:24 pm on Dec 10, 2009 (gmt 0)|
You don't need to fear the computer. You need to fear a hand review. A real person can certainly tell the difference between on and off topic links.
Still, I disagree that they can't tell. You'd think they can't tell a lot of things until you see a visual link graph. Non-intuitive stuff becomes very, very clear. And it doesn't have to be about genres. If your site is about SEO and the word SEO appears on the pages that link to you, that doesn't seem to be very difficult to decipher. Going from that obvious step to determining whether a page has information 'related' to SEO doesn't seem like a stretch.
Still, I think the jury's out. I wouldn't call it complete BS. And even if it is, remember the hand review.
| 10:51 pm on Dec 12, 2009 (gmt 0)|
From my experience, and from what I see active daily in the genres I watch, a certain percentage of relevant links are required. A certain threshold of trust needs to be crossed.
After that, a whack of totally off topic links from blogrolls will get you what you want and keep you there.
Realistically hand reviews are the antidote here, but it would seem that the justice never gets handed down.
I would love to set up a controlled experiment around this and publish the results so that Google could take a serious look at it, but much of WebmasterWorld probably already knows what the results would be.
Google, in my opinion, has a very difficult time knowing how to apply value from links, determine relevance, and apply depreciation. Often good sites get hit incorrectly.
Still, I am a big believer in relevant-only links, and hopefully one day Google will be too :)
| 6:05 am on Dec 13, 2009 (gmt 0)|
.... if you get past the automated link profiling from Google of authority and relevant links and start to add lots of blogroll links etc , you may not pass the editorial hand checks if they come along.
Trophy terms stick out like the proverbial canines and are sure to be flagged by all and sundry. So stick to relevant links, and the rest in moderation, so that you don't provoke a reviewer to penalise you.
| 7:11 pm on Dec 13, 2009 (gmt 0)|
|you may not pass the editorial hand checks if they come along |
And, conversely, you might pass an editorial check, and likely seal in your tenure in the top positions.
Which is, unfortunately, why we see so much of it happening in Google. Guys are willing to take the bet that they won't be dropped. Not a chance I would like to take!
| 6:07 pm on Dec 14, 2009 (gmt 0)|
Thank you guys for such thoughtful responses. I apologize for the sarcastic post, but my logic was still grounded :)
It seems to me that everybody agrees that you still should stick to "relevant" sites, but the jury is still out as to exactly how and if Google will penalize links that aren't.
Martinibuster: I like the AdSense theory. However, I regularly get AdSense ads that are off the wall, which makes sense (Google has billions of these things running). If anything, I think they sometimes show Google's inability to make an exact determination of page content.
Wheel: What sort of link graph are you talking about? Just curious about these.
CainIV: I agree with you, a test would be great. It is also a shame that good sites are getting hit incorrectly.
I guess I should mention why I asked this initially. One of the basic tactics I use to linkbuild is searching the web for instances of my keywords appearing in-content.
Sometimes my keywords will show up in a random high PR blog post that is not necessarily in my site's vertical. But of course I want the link! I can't imagine that Google would have the time, resources, or reliable enough algorithms to discount a link like that with any certainty.
| 7:03 pm on Dec 14, 2009 (gmt 0)|
There's what works, and what works now. Don't confuse the two.
Relevant may or may not work now. Irrelevant may or may not work now. But if you were to make a bet, which one do you think has a high probability of being only 'what works now'?
There's nothing wrong with what you're doing. Just appreciate the risk - Google doesn't want you to do what you're doing. Do they at some point take the time to figure it out? And if they do, what happens to your site?
Everyone's giddy with paid links, automated links, etc. But every few years everyone figures this stuff out and goes nuts on it as the new, easy way to rank. And it's like predicting a stock market crash: I can keep predicting Google's going to catch on and wipe a lot of sites off the face of the earth, and if I wait long enough I'll be right :).
Remember the directory crash? Remember when recips got killed? I remember. It's been a while, so maybe some folks are new enough not to remember. But it's pretty catastrophic when it happens - entire swaths of large sites gone from the SERPs for a long time, if not forever.
Do they want you to develop relevant links? I believe they do.
| 7:14 pm on Dec 14, 2009 (gmt 0)|
>>>I think they sometimes show Google's inability to make an exact determination of page content.
That's not a reflection of Google's inability to determine relevance. It's a reflection of Google's ability to predict the highest paying click. In the case of AdSense, what's going on is they are ALSO making a best guess as to what will get the highest paying click or the most clicks.
The basic relevance matching is solid, which is why I use AdSense as an example. The odd occurrences where they appear to get it wrong don't have to do with matching content to content; they have to do with estimating the revenue to be earned from showing a high-clickthrough or high-paying ad. I have seen a number of cases where a high CPC caused an ad to be displayed in the wrong context. Remove CPC from the equation and the matching is solid.
The task of juggling the relevancy of ads to content then factoring CPC etc. is still a good example of how smart Google is. Doing this for links is, I think, easier because there is no CPC/auction calculation to skew the results.
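The CPC point above can be sketched in a few lines. This is only an illustration, not how AdSense actually works; the ad names, relevance scores, and CPC figures are all invented:

```python
# Toy sketch: how a revenue term can make a less relevant ad win,
# even when pure content matching is solid. All numbers are invented.

def pick_ad(ads, by_revenue):
    """ads: list of (name, relevance, cpc) tuples, relevance in [0, 1]."""
    if by_revenue:
        # expected revenue ~ relevance (proxy for clickthrough) * cpc
        key = lambda ad: ad[1] * ad[2]
    else:
        key = lambda ad: ad[1]  # pure relevance matching
    return max(ads, key=key)[0]

ads = [
    ("on-topic widget ad", 0.9, 0.10),         # very relevant, cheap clicks
    ("off-topic high-CPC ad", 0.3, 2.00),      # barely relevant, expensive clicks
]

print(pick_ad(ads, by_revenue=False))  # on-topic ad wins on relevance alone
print(pick_ad(ads, by_revenue=True))   # high-CPC ad wins once revenue is factored in
```

Remove the CPC term and the relevant ad wins every time, which is the point being made: the occasional "wrong" ad reflects the auction, not the matching.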
| 9:46 pm on Dec 14, 2009 (gmt 0)|
Very good point Martinibuster, I guess I didn't think of it from the CPC angle. I guess I can't blame them for trying to make a buck!
And Wheel...you are making me scared! I guess I wasn't around to see the directory and reciprocal links get smashed, I sure hope my efforts don't end up the same. Luckily I have acquired many different types of links, which I think will help cushion any blow that may come in the future.
Thanks guys :)
| 10:03 pm on Dec 14, 2009 (gmt 0)|
|Luckily I have acquired many different types of links, which I think will help cushion any blow that may come in the future. |
That's certainly one thing I try to do. I figure if I have links of types A, B, and C, and Google dials down type A and turns up the dial on type C, then I still rank because of my C links.
Alternatively, it's just integrating your website into the fabric of the web via links from as many diverse places as one can. Trying to beat Google by thinking about it much more than that is, I think, a futile game long term. In the short term one can still get easy rankings and make some cash.
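The A/B/C reasoning above is essentially a weighted sum. A tiny sketch, with made-up link counts and weights, showing how a diversified profile keeps much of its score when one link type gets dialed down:

```python
# Toy illustration of the diversification argument: if Google devalues
# one link type, the other types still carry score. All numbers invented.

def link_score(counts, weights):
    return sum(counts[t] * weights.get(t, 0) for t in counts)

counts = {"A": 50, "B": 30, "C": 20}      # links of each type
before = {"A": 1.0, "B": 1.0, "C": 1.0}   # all types count equally
after  = {"A": 0.5, "B": 1.0, "C": 1.5}   # type A dialed down, C dialed up

print(link_score(counts, before))  # 100.0
print(link_score(counts, after))   # 50*0.5 + 30*1.0 + 20*1.5 = 85.0
```

A profile that was 100% type A would have dropped by half in this scenario; the mixed profile loses only 15%.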
| 6:58 am on Dec 17, 2009 (gmt 0)|
Google kind of looks at a link to see if the link is related. It is very general, and they only depreciate a link when it is very far off. This does not mean that such links are worthless; they just have less value. It is not as simple as related or not. There are many factors that go into how much value a link has. Anchor text still has a lot of value: if you get a link with just your domain or "click here" as the anchor, the value is much lower than if you got a link with keywords as anchor text.
Link building is important. I would not just skip getting a link. I might put it off until things slow down on the related websites.
| 6:15 pm on Dec 17, 2009 (gmt 0)|
|If you get just a link with just your domain or "click here" the value is much lower than if you got a link with keywords as anchor text. |
I don't believe so. The keyword anchor, or lack of one, imo, only changes the nature of the link, not how effective it is. The value is the same.
| 3:42 pm on Dec 18, 2009 (gmt 0)|
I always just assumed that the keywords on your page would need to have some matches with the keywords on the page giving the link.
| 7:47 pm on Dec 18, 2009 (gmt 0)|
No one seems to have mentioned AdPlanner - sites are categorized there independently of DMOZ.
| 9:15 pm on Dec 18, 2009 (gmt 0)|
Using the K.I.S.S. method
#1 - search figures out your site's primary topic.
#2 - search figures out how many of the keywords it believes are related to that topic your site actually covers.
#3 - search assigns your site a score.
#4 - search slightly increases your score if incoming links are of good score AND also highly related.
That's overly simplistic, but the message often missed is that everything is related. If your site is about widgets but you cover none of the subjects normally related to widgets, you start off with a low score, and incoming links from those related subjects provide less benefit.
Every page on your site stands alone in terms of its merits, but also plays a role in your overall site. If all of your pages cover topics related to your main topic (as judged by Google), you get an excellent score. If your site is overly diluted, or you have 100 pages fighting for the same keyword when only ONE can be considered about that keyword (and only one returned in search for it), you get a lower "score" overall, even if individual pages get great scores, and pass/receive less cumulative value.
The days of a link being a link are over. If you want to build the perfect site, think of it as a Christmas tree: your main keyword is the angel up top, and you branch out from there with related keywords. One page per keyword, with spot-on page titles, and no "check out this super cool new pink widget" stuff if pink widget is the content - those six extra words serve only to confuse, unless you really think someone will ever type all that in verbatim.
How many pages on your site have variations of the same keyword? Wouldn't a single page about that keyword be easier to use as a visitor?
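The four K.I.S.S. steps above can be sketched roughly. This is a guess at the shape of the idea, not Google's algorithm; the topic detection, coverage counting, and bonus weights are all stand-ins:

```python
# Hypothetical sketch of the four K.I.S.S. steps. Every threshold and
# weight here is invented for illustration.

def site_score(site_terms, related_terms, inbound_links):
    # 1. primary topic = the site's dominant term (stand-in for real topic detection)
    primary = max(set(site_terms), key=site_terms.count)
    # 2. how many of the keywords believed related the site actually covers
    coverage = len(set(site_terms) & set(related_terms))
    # 3. base score from that coverage
    score = coverage / max(len(related_terms), 1)
    # 4. slight boost for inbound links that score well AND match the topic
    for strength, topic in inbound_links:
        if topic == primary and strength > 0.5:
            score += 0.05 * strength
    return primary, round(score, 3)

site = ["widgets", "widgets", "pink widgets", "widget repair"]
related = ["widgets", "widget repair", "widget reviews", "pink widgets"]
links = [(0.9, "widgets"), (0.8, "travel"), (0.2, "widgets")]
print(site_score(site, related, links))
```

Note how the strong off-topic link and the weak on-topic link both fail step 4: only the link that is good AND related adds to the score, which is the post's point.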
| 12:33 pm on Dec 19, 2009 (gmt 0)|
"The main reason to place a link on a relevant site is to attract an interested visitor. "
I hope this doesn't come off as sarcasm, but is that really the main reason to place a link on a relevant site for most webmasters? Whenever I asked about this topic in the past (I'm a fanboy of link traffic, because G's moods can't kill it ;)), it seemed that everyone agreed that whatever you do, the benefit from a link usually lies in the SE traffic it brings, in comparison to which the value of the actual link traffic usually pales.
I would say, unfortunately, the main reason to get a link on relevant sites is probably more that you don't want to piss off G :-)
| 6:39 pm on Dec 20, 2009 (gmt 0)|
Why is it hard to believe that Google can determine whether a page linking to your URL is relevant? Think about it...
How does Google determine if YOUR URL is relevant to the user's search phrase? They look at on page factors... like title, h1, content, etc. to see if the search phrase, slight variations of the search phrase, partial matches for the search phrase, terms related to the search phrase, etc. appear in the various page elements. Then they look at off-page factors... like link text of the inbound links and how much PR/link juice is being passed in via those links.
So why wouldn't they do the same to determine if the page linking to your URL is relevant to the search phrase? Look at on-page factors on the page linking to you, like title, h1, content, etc., to see if the search phrase, slight variations of it, partial matches for it, terms related to it, etc. appear in the various page elements. Then look at off-page factors for the page linking to you, like the link text of the inbound links to that page and how much PR/link juice is being passed in via those links.
In other words, all they really have to do is see how each of the pages linking to you rank for the same keyword phrase they are ranking your URL for. The higher they rank, the more relevant... assuming their algorithm already ranks URLs for the SERPs based on relevance to the user's search phrase.
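The procedure described above - score the linking page with the same on-page checks used on your own page - can be sketched like this. The field weights and sample pages are hypothetical:

```python
# Hypothetical sketch: score any page's relevance to a search phrase from
# the same on-page signals (title, h1, body) described above, then apply
# the identical check to the page linking to you. Weights are invented.

def phrase_relevance(page, phrase):
    weights = {"title": 3.0, "h1": 2.0, "body": 1.0}  # title counts most
    terms = phrase.lower().split()
    score = 0.0
    for field, weight in weights.items():
        text = page.get(field, "").lower()
        score += weight * sum(term in text for term in terms)
    return score

your_page = {"title": "Blue widget reviews", "h1": "Blue widgets",
             "body": "All about blue widgets."}
linking_page = {"title": "My travel diary", "h1": "Paris",
                "body": "I bought a blue widget souvenir."}

# The better the linking page itself scores for the phrase you rank for,
# the more relevant (and presumably valuable) its link is judged to be.
print(phrase_relevance(your_page, "blue widget"))     # 12.0
print(phrase_relevance(linking_page, "blue widget"))  # 2.0
```

Under this view the travel blog's in-content mention still counts for something, just much less than a link from a page that would itself rank for the phrase.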