| 5:18 am on Apr 24, 2007 (gmt 0)|
|With reciprocal links gone, paid links removed, and linking networks subjected to SPAM reports, it looks to me as though the value of links is going to play a big second fiddle to content. |
This suggests to me that high quality, on-topic links will be more valuable than ever.
| 5:28 am on Apr 24, 2007 (gmt 0)|
No, no and no. Links are the base of their algo. I get lots of links and I always rank as long as the domain is trusted. So links + domain are the two main things you need to look at.
| 5:43 am on Apr 24, 2007 (gmt 0)|
>>reciprocal links are now out
Based on a fairly new site, I'm not seeing that happening. But it may well be that kindred topics can carry weight.
| 6:35 am on Apr 24, 2007 (gmt 0)|
=> age of link
=> age of site where link originates
=> source of link
=> link structure of source (link dynamics of source)
HINT: think touchgraph!
=> link structure of landing site (inbound/outbound)
=> quality of content on site where link originates
=> quality of content on site where link lands
=> relevancy of the relationship
Of course, the link is at the core of the web, so Google can't just devalue it. They have to find new ways to quantify and qualify link relationships, and since they hold reams (what's bigger than a ream?) of historical data on everything they crawl and index, they can work inwards for the solution.
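The factor list above could be caricatured as a weighted score. This is a purely illustrative sketch: the factor names, normalization, and weights are my own assumptions for discussion, not anything Google has confirmed.

```python
# Hypothetical sketch only: factors mirror the list in this post, but the
# weights and the 0-1 normalization are invented for illustration.

def score_link(link):
    """Combine the speculated link-quality factors into one score.

    `link` maps factor names to values already normalized to 0..1.
    Missing factors count as 0.
    """
    weights = {
        "link_age": 0.15,        # age of the link itself
        "source_age": 0.10,      # age of the site the link originates from
        "source_quality": 0.20,  # content quality of the originating site
        "target_quality": 0.15,  # content quality of the landing site
        "link_structure": 0.15,  # inbound/outbound link dynamics of both ends
        "relevancy": 0.25,       # relevancy of the relationship
    }
    return sum(weights[f] * link.get(f, 0.0) for f in weights)

# An aged, on-topic link from a decent site scores well under this toy model.
example = {
    "link_age": 0.8, "source_age": 0.9, "source_quality": 0.7,
    "target_quality": 0.6, "link_structure": 0.5, "relevancy": 0.9,
}
print(round(score_link(example), 3))  # 0.74
```

Note the heaviest invented weight sits on relevancy, echoing the thread's consensus that on-topic beats sheer volume.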
| 6:40 am on Apr 24, 2007 (gmt 0)|
I don't really think links can somehow disappear; they're the base of the algo.
What Google are doing now is saving on human editor salaries by using multiple information channels.
| 7:17 am on Apr 24, 2007 (gmt 0)|
Right on the money Robert
Links are still a big factor; it doesn't matter which way they are gained, recips or one-ways, as long as they are quality on-topic inbounds.
| 7:57 am on Apr 24, 2007 (gmt 0)|
|Vimes -As long as they are quality on-topic inbounds. |
But not paid links? Yes / No? The only place to get large quantities of links is from brokers or link networks. Both stand out like dog's proverbials.
The difficulty is in getting large quantities of non-paid links, I think. The odd authority link here or there is no problem, but medium to large sites will struggle. Yes/No?
I see good established sites with few backlinks [ large sites ] bringing 1-2M plus visitors a month and they are not harvesting much natural linkage. So reliance must be on other factors.
|Crush - I get lots of links and I always rank as long as the domain is trusted. So links + domain are the 2 main thing you need to look at. |
|Crush - earlier comment on other post - Going through the backlinks of some of the mega travel sites I see not one external link and them ranking for some money keywords. So IMO I definately see internals as a BIG factor if you have a big site with a load of trust [webmasterworld.com...] |
I agree with your earlier observation on the link above. Frankly, I'm seeing sites doing well on highly competitive phrases ranking on content alone, including ours, and we're no "trusted" Wikipedia or Yahoo Answers.
So whilst I understand where you're coming from on this post, you seem to be saying the same thing re "content alone" on the earlier linked post.
How do you reconcile these thoughts with Google's current apparent emphasis?
[edited by: Whitey at 8:23 am (utc) on April 24, 2007]
| 8:33 am on Apr 24, 2007 (gmt 0)|
I have heard somewhere: links, quality vs. quantity.
I think the score is 1-1. I have studied many sites with high rankings for competitive keywords; they do not have any trusted or good links, yet they attack the big G with over 10,000 backlinks from reciprocal partners with spammy anchor text [webmasterworld.com], and they are still very young, less than one year old.
Google do give us many results; however, I guess over 50% of them are coming from spammy sites.
| 9:53 am on Apr 24, 2007 (gmt 0)|
Well, paid links coming from content-related pages would be hard for Google to filter as long as they don't stand out like dog's proverbials.
If your links are placed under headers like "advertising" or "sponsors", then Google might be able to filter them. If they are placed within text and are on-topic, then I don't see how they can be. (I've not read the other thread, so I could be way off on that statement.)
It's the old "good content gains good links" syndrome: if you have the information, you'll gain one-way links. If you choose to use that content to link out from for recips etc., make sure the links are high quality and on-topic.
But as mentioned by itravelvietnam, I lean toward quality being more important than quantity. Of course, if you can get both, you're a very happy website :)
That's how I see it ATM anyway.
| 3:13 pm on Apr 24, 2007 (gmt 0)|
"This suggests to me that high quality, on-topic links will be more valuable than ever."
Absolutely. And I think it means that, all else being equal, fewer quality links will be able to trump large numbers of relatively lesser-quality links. In fact, I'm sure I'm seeing that now in the areas I watch.
| 4:27 pm on Apr 24, 2007 (gmt 0)|
Links are not the base of the algo; they are the base of crawling, though.
People forget that there are hundreds of scoring factors that outweigh links.
| 4:34 pm on Apr 24, 2007 (gmt 0)|
I have absolutely no reason to believe google is ranking sites with the best relevant content right now. I still see new cookie cutter sites with cookie cutter content breaking through. They have a long way to go.
| 4:46 pm on Apr 24, 2007 (gmt 0)|
" I have absolutely no reason to believe google is ranking sites with the best relevant content right now."
How do you get a machine to actually understand what a page is about, read between the lines, glimpse the subtext, and perform the same kind of qualitative evaluation that you might receive from a human reviewer that says "this page is better than that one"? At this point, you can't. Even if human reviewers became an integral part of ranking, you wouldn't be safe relying on it too heavily. After all, Roger Ebert is an acknowledged film review expert. Sometimes his opinion purely stinks, though.
You know, even with human quality checkers, you couldn't allow them to have too much influence on results. Probably the most you could safely rely on them for is detecting spam and manipulation that the algorithms can't catch. And this is reflected in Google's strong emphasis on spam detection.
Why are they so focused on spam detection versus providing quality results? Because to some extent that's all they can do.
| 5:37 pm on Apr 24, 2007 (gmt 0)|
|And I think it means that, all else being equal , fewer quality links will be able to trump large numbers of relatively lesser quality links. In fact, I'm sure I'm seeing that now in the areas I watch. |
I've been seeing that for quite some time, and understandably so: Of what value to the user are reciprocal "votes" between, say, a New Zealand real-estate agency and an affiliate site that deals in Irish car rentals? It can't be that hard for Google to discount obvious junk links.
| 10:53 pm on Apr 24, 2007 (gmt 0)|
Why would Google be trying to combat the paid text link phenomenon if they were going to start devaluing the importance of links? That would be like asking someone not to scratch the paint on a car in the junk yard.
| 12:35 am on Apr 25, 2007 (gmt 0)|
|Why would Google be trying to combat the paid text link phenomenon if they were going to start devaluing the importance of links? |
Exactly! Google's main thrust in their algorithm is still based on votes (links) plus trust. That's not going to change anytime soon.
|It can't be that hard for Google to discount obvious junk links. |
I agree. I am no programmer, but I think your example is a good one and even a dumb machine could easily be programmed to suss out those types of off topic and inconsequential links.
I think Google's goal is really very simple. They want to identify "natural and valid" votes and discount all others. Full stop. If I owned a search engine, that's what I would want. I would then set about finding ways and means to identify the natural linking patterns for the various types of businesses.
For example, the travel industry is rife with spammers. But there are millions of legitimate sites too. This is where directories (who have earned trust) come into play. Break down every little area of travel and you will soon discover that there are only 16 restaurants on the island of whogivesadarn. There are only 46 travel related businesses (in total) on that island. The average travel related web site for the island of whogivesadarn has between 200 and 3,000 legitimate IBLs. One site has over 8,000 IBLs and it happens to be the Tourist Board site for whogivesadarn. An authority site.
In comes a new hotel to the magical island of whogivesadarn and suddenly this new site has nearly 6,000 links in the first year. Possible? Yes. Likely? No. Red flags go up, filters kick in and all links are investigated. How many are reciprocal? How many are truly relevant? How many appear to be completely off topic? Identify, identify, identify!
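The red-flag scenario above amounts to a simple range check. A hypothetical sketch using the post's own numbers; the two-year cutoff and the logic are my own guesses at how such a filter might work, not anything Google has described.

```python
# Hypothetical filter sketch: the 200-3,000 niche range and the 6,000-link
# new hotel come straight from the post; the age cutoff is an assumption.

def link_red_flag(site_age_years, inbound_links, niche_range=(200, 3000)):
    """Flag a young site whose link count far exceeds its niche's norm.

    A flagged site would then have its links investigated: how many are
    reciprocal, how many truly relevant, how many completely off-topic?
    """
    lo, hi = niche_range
    return site_age_years < 2 and inbound_links > hi

# New hotel on whogivesadarn, first year, nearly 6,000 links -> flagged.
print(link_red_flag(site_age_years=1, inbound_links=6000))  # True
# An established site with the same count would not trip this check.
print(link_red_flag(site_age_years=8, inbound_links=6000))  # False
```

The point of the sketch is that the flag only triggers an investigation; it doesn't penalize on its own, matching the post's "identify, identify, identify" framing.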
With all the research Google has at their fingertips, it really just boils down to weighting the various factors when trying to identify blatant and even subtle forms of manipulation. A ski resort in Aspen will not have the same linking patterns as a pizza joint in New York City, but there will be many commonalities. A news site will have commonalities in linking patterns to other news sites but not much in common with the pizza joint or ski resort because they are very much a "living" entity versus a static site.
If webmasters are hell bent on getting ahead by buying links ... I strongly suggest that you stick to on topic and relevant links which will appear to be natural to any of the search engines ... over the long haul. Personally, I see no benefit to reciprocal links unless they make sense to your readers.
|Is Google removing an emphasis on links for results? |
No, I don't think they are removing the emphasis on links at all. I think they are just getting really serious about identifying unnatural linking patterns which are intended to manipulate search results. Paid links in a directory are pretty much standard fare and Google is smart enough to recognize that legitimate directories offer legitimate links ... paid or otherwise.
Google are simply trying to do what any of us would do if we owned a search engine: identify those pages which are trying to manipulate search results by any means possible.
They are getting better and better at it and the only lasting solution for webmasters and web site owners (when trying to rank well) is to provide better and better content which people are legitimately interested in and will link to naturally.
Somebody who has a new site and hires an SEO firm to "put them on the map" had better choose their SEO wisely. Cutting corners just isn't going to work if you are in it for the long term.
By all means write the site so that the search engines can correctly identify the content of the page ... but do it without annoying your human audience by keyword stuffing.
Go ahead and buy or swap links from and with legitimate, on topic sites which benefit your audience. Just stay away from swapping links to sites your reader would have no interest in or buying links from sites which are obviously off topic but have a high PR. If you might question the validity of a link when reviewing your own site ... chances are so would Google.
Yeah sure, there are a whole lot of other factors which matter to Google when determining their ranking algo ... but legitimate links are still the best way to identify a great site. As I said, they are getting better and better at identifying legitimate links. To stay ahead, so should we. The best way to do that is provide great content and they will come.