Clearly it's hit a nerve with the amount of activity on Matt's blog and here at WebmasterWorld.
Here are my observations over the last year:
- Big Daddy: reciprocal links are now out [ or of very little value ]; more emphasis on content integrity, with duplicate content filters being tweaked.
- "Authority" content is scoring progressively more favourable results: Wikipedia, Yahoo Answers, G**tree, etc.
- Matt Cutts issues a warning on the use of "nofollow" on link advertising.
With reciprocal links gone, paid links removed, and linking networks subjected to spam reports, it looks to me as though the value of links is going to play second fiddle to content.
=> age of link
=> age of site where link originates
=> source of link
=> link structure of source (link dynamics of source)
HINT: think touchgraph!
=> link structure of landing site (inbound/outbound)
=> quality of content on site where link originates
=> quality of content on site where link lands
=> relevancy of the relationship
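Purely as an illustration of how the factors listed above might combine, here is a hypothetical weighted link score. Every factor name and weight below is invented; Google's real signals and weights are unknown.

```python
# Hypothetical sketch: combine the link factors from the list above
# into a single score. Weights are invented for illustration only.
FACTOR_WEIGHTS = {
    "link_age": 0.15,          # age of the link
    "source_age": 0.10,        # age of the site where the link originates
    "source_quality": 0.20,    # content quality where the link originates
    "target_quality": 0.20,    # content quality where the link lands
    "relevancy": 0.25,         # relevancy of the relationship
    "link_structure": 0.10,    # link dynamics of source ("touchgraph")
}

def link_score(factors):
    """Combine normalised factor values (each 0.0-1.0) into one score."""
    return sum(FACTOR_WEIGHTS[name] * factors.get(name, 0.0)
               for name in FACTOR_WEIGHTS)

# An aged, on-topic link from a quality site vs. a fresh off-topic one.
good = link_score({"link_age": 0.9, "source_age": 0.8,
                   "source_quality": 0.9, "target_quality": 0.8,
                   "relevancy": 1.0, "link_structure": 0.7})
junk = link_score({"link_age": 0.1, "relevancy": 0.0,
                   "source_quality": 0.2})
```

Under this sketch the on-topic aged link scores 0.875 while the junk link scores 0.055, which is the point of the list: the same "vote" can carry wildly different weight once these factors are applied.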
Of course, the link is at the core of the web, so Google can't just devalue it... they have to find new ways to quantify and qualify link relationships... and since they hold reams (what's bigger than a ream?) of historical data on everything they crawl and index... they can work inwards for the solution...
Vimes - As long as they are quality, on-topic inbounds.
But not paid links? Yes / No? The only place to get large quantities of links is from brokers or link networks. Both stand out like dog's proverbials.
The difficulty is in getting large quantities of non-paid links, I think. The odd authority link here or there is no problem, but medium to large sites will struggle. Yes/No?
I see good, established sites with few backlinks [ large sites ] bringing 1-2M plus visitors a month, and they are not harvesting much natural linkage. So reliance must be on other factors.
Crush - I get lots of links and I always rank, as long as the domain is trusted. So links + domain are the two main things you need to look at.
Crush - earlier comment on another post - Going through the backlinks of some of the mega travel sites, I see not one external link, and yet they rank for some money keywords. So IMO I definitely see internals as a BIG factor if you have a big site with a load of trust [webmasterworld.com...]
I agree with your earlier observation on the link above - frankly I'm seeing sites doing well on highly competitive phrases ranking on content alone - including ours - but we're no "trusted" Wikipedia or Yahoo Answers.
So whilst understanding where you're coming from on this post, you seem to be saying the same thing re "content alone" on the earlier linked post.
How do you match these thoughts in the context of current apparent emphasis by Google?
[edited by: Whitey at 8:23 am (utc) on April 24, 2007]
Google does give us many results; however, I guess over 50% of them are coming from spammy sites.
But as mentioned by itravelvietnam, I edge toward quality being more important than quantity. Of course, if you can get them both, you're a very happy website :)
That's how I see it ATM anyway.
Absolutely. And I think it means that, all else being equal, fewer quality links will be able to trump large numbers of relatively lesser-quality links. In fact, I'm sure I'm seeing that now in the areas I watch.
How do you get a machine to actually understand what a page is about, read between the lines, glimpse the subtext, and perform the same kind of qualitative evaluation you might receive from a human reviewer who says "this page is better than that one"? At this point, you can't. Even if human reviewers became an integral part of ranking, you wouldn't be safe relying on it too heavily. After all, Roger Ebert is an acknowledged film review expert. Sometimes his opinion purely stinks, though.
You know, even with human quality checkers, you couldn't allow them to have too much influence on results. Probably the most you could safely rely on them for is detecting spam and manipulation that the algorithms can't catch. And this is reflected in Google's strong emphasis on spam detection.
Why are they so focused on spam detection versus providing quality results? Because to some extent that's all they can do.
And I think it means that, all else being equal, fewer quality links will be able to trump large numbers of relatively lesser-quality links. In fact, I'm sure I'm seeing that now in the areas I watch.
I've been seeing that for quite some time, and understandably so: Of what value to the user are reciprocal "votes" between, say, a New Zealand real-estate agency and an affiliate site that deals in Irish car rentals? It can't be that hard for Google to discount obvious junk links.
Why would Google be trying to combat the paid text link phenomenon if they were going to start devaluing the importance of links?
Exactly! Google's main thrust in their algorithm is still based on votes (links) plus trust. That's not going to change anytime soon.
It can't be that hard for Google to discount obvious junk links.
I agree. I am no programmer, but I think your example is a good one and even a dumb machine could easily be programmed to suss out those types of off topic and inconsequential links.
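To show how even a "dumb machine" could suss out the New Zealand real-estate / Irish car-rental example above, here is a crude, purely illustrative check: compare keyword overlap between the two sites. The word lists and the Jaccard-style measure are my own invention, not anything Google has disclosed.

```python
def topical_overlap(words_a, words_b):
    """Jaccard similarity between two sites' keyword sets (0.0-1.0)."""
    a, b = set(words_a), set(words_b)
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Invented keyword sets for the hypothetical sites in the example above.
nz_realestate = ["auckland", "property", "house", "estate", "agent", "sale"]
irish_rentals = ["dublin", "car", "rental", "hire", "vehicle", "airport"]
nz_mortgages  = ["auckland", "mortgage", "house", "property", "loan", "agent"]

# The off-topic reciprocal "vote" shows zero overlap, while a genuinely
# related site shares half its vocabulary.
junk_pair = topical_overlap(nz_realestate, irish_rentals)   # 0.0
real_pair = topical_overlap(nz_realestate, nz_mortgages)    # 0.5
```

A real engine would use far richer signals than raw word overlap, but the point stands: links between topically unrelated sites are trivially detectable.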
I think Google's goal is really very simple. They want to identify "natural and valid" votes and discount all others. Full stop. If I owned a search engine, that's what I would want. I would then set about finding ways and means to identify the natural linking patterns for the various types of businesses.
For example, the travel industry is rife with spammers. But there are millions of legitimate sites too. This is where directories (who have earned trust) come into play. Break down every little area of travel and you will soon discover that there are only 16 restaurants on the island of whogivesadarn. There are only 46 travel related businesses (in total) on that island. The average travel related web site for the island of whogivesadarn has between 200 to 3,000 legitimate IBL's. One site has over 8,000 IBL's and it happens to be the Tourist Board site for whogivesadarn. An authority site.
In comes a new hotel to the magical island of whogivesadarn and suddenly this new site has nearly 6,000 links in the first year. Possible? Yes. Likely? No. Red flags go up, filters kick in and all links are investigated. How many are reciprocal? How many are truly relevant? How many appear to be completely off topic? Identify, identify, identify!
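The "red flags go up" idea in the anecdote above can be sketched as simple outlier detection. All numbers, the age cutoff, and the 3x-median threshold below are invented to match the whogivesadarn story; real filters would weigh many more signals.

```python
import statistics

# Hypothetical numbers from the anecdote: most travel sites on the
# island have 200-3,000 inbound links (IBLs); the long-established
# Tourist Board authority site has ~8,000.
peer_ibl_counts = [250, 400, 900, 1200, 1800, 2500, 3000, 8000]

def looks_unnatural(ibl_count, site_age_years, peers, multiple=3.0):
    """Red-flag a young site whose link count dwarfs the peer median.

    The age test and 3x-median threshold are invented for illustration.
    """
    median = statistics.median(peers)
    return site_age_years < 2 and ibl_count > multiple * median

# The new hotel with ~6,000 links in its first year trips the filter;
# the 15-year-old Tourist Board site with 8,000 links does not.
flag_hotel = looks_unnatural(6000, 1, peer_ibl_counts)
flag_board = looks_unnatural(8000, 15, peer_ibl_counts)
```

Flagging is only the first step, of course; as the post says, the flagged links would then be investigated individually (reciprocal? relevant? off topic?).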
With all the research Google has at their fingertips, it really just boils down to weighting the various factors when trying to identify blatant and even subtle forms of manipulation. A ski resort in Aspen will not have the same linking patterns as a pizza joint in New York City, but there will be many commonalities. A news site will have commonalities in linking patterns to other news sites but not much in common with the pizza joint or ski resort because they are very much a "living" entity versus a static site.
If webmasters are hell bent on getting ahead by buying links ... I strongly suggest that you stick to on topic and relevant links which will appear to be natural to any of the search engines ... over the long haul. Personally, I see no benefit to reciprocal links unless they make sense to your readers.
Is Google removing an emphasis on links for results?
No, I don't think they are removing the emphasis on links at all. I think they are just getting really serious about identifying unnatural linking patterns which are intended to manipulate search results. Paid links in a directory are pretty much standard fare and Google is smart enough to recognize that legitimate directories offer legitimate links ... paid or otherwise.
Google are simply trying to do what any of us would do if we owned a search engine: identify those pages which are trying to manipulate search results by any means possible.
They are getting better and better at it and the only lasting solution for webmasters and web site owners (when trying to rank well) is to provide better and better content which people are legitimately interested in and will link to naturally.
Somebody who has a new site and hires an SEO firm to "put them on the map" had better choose their SEO wisely. Cutting corners just isn't going to work if you are in it for the long term.
By all means write the site so that the search engines can correctly identify the content of the page ... but do it without annoying your human audience by keyword stuffing.
Go ahead and buy or swap links from and with legitimate, on topic sites which benefit your audience. Just stay away from swapping links to sites your reader would have no interest in or buying links from sites which are obviously off topic but have a high PR. If you might question the validity of a link when reviewing your own site ... chances are so would Google.
Yeah sure, there are a whole lot of other factors which matter to Google when determining their ranking algo ... but legitimate links are still the best way to identify a great site. As I said, they are getting better and better at identifying legitimate links. To stay ahead, so should we. The best way to do that is provide great content and they will come.