Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 32 message thread spans 2 pages.
Matt Cutts: Search Results Look Worse Without Links As Ranking Factor
aakk9999
msg:4646822 - 10:44 pm on Feb 19, 2014 (gmt 0)

Matt Cutts was asked whether Google has a version of the SERPs that excludes backlinks as a ranking factor. He replied that Google has experimented with this internally and that the quality of the search results looks much worse.

Is there a version of Google that excludes backlinks as a ranking factor
https://www.youtube.com/watch?v=NCY30WhI2og

So we don't have a version like that that is exposed to the public but we have our own experiments like that internally and the quality looks much much worse. It turns out backlinks, even though there is some noise and certainly a lot of spam, for the most part are still a really really big win in terms of quality of search results.

We played around with the idea of turning off backlink relevance and at least for now backlink relevance still really helps in terms of making sure that we return the best, most relevant, most topical set of search results.



Shepherd
msg:4646829 - 11:05 pm on Feb 19, 2014 (gmt 0)

5,000+ "link building is dead" articles just got rewritten!

brotherhood of LAN
msg:4646830 - 11:26 pm on Feb 19, 2014 (gmt 0)

If only they could identify who is giving and receiving links, that'd be extremely powerful for them. Until then it seems their powerful algorithms are reduced to straining link soup to identify the good and bad. Taking it up a level to who is 'owner' and 'receiver' of the links could be an excellent avenue IMO.

Shame they don't have a means to do that, though. I think authorship and knowledge graph could be the direction Google would like it to head.

aristotle
msg:4646844 - 12:22 am on Feb 20, 2014 (gmt 0)

Well, if you spend 15 years tweaking your algorithm around backlinks, obviously the results will look worse if you abruptly take backlinks out of it.

tangor
msg:4646846 - 12:56 am on Feb 20, 2014 (gmt 0)

And hence, all these algo updates in the last few years which make us all crazy. (sigh)

EditorialGuy
msg:4646874 - 1:47 am on Feb 20, 2014 (gmt 0)

Of course search quality would suffer without citations (a.k.a. backlinks) as a signal. The question isn't whether Google should include backlinks in the algorithm, it's how to distinguish the genuine citations from the dreck.

Awarn
msg:4646891 - 2:28 am on Feb 20, 2014 (gmt 0)

The thing that gets me is that when I look at the backlinks, I sometimes wonder where they all come from. It seems like the older the site, the more links it has. Then Google just hammers you because some garbage directories link to you. Build an age factor into the picture and you might have something: the older the site, the more tolerance is given, to balance the fact that more directories may have picked it up and linked to it. I see directories that link to you with no way to contact them (so why does Google not deindex those sites?). If Google built an age factor in, it would hurt spammers and hurt churn-and-burn techniques.

Ralph_Slate
msg:4646893 - 2:31 am on Feb 20, 2014 (gmt 0)

Backlinks are going to be the foundation of any search engine, because they are a form of crowd sourcing. The only thing that could be better is if Google hired expert curators on various subjects. I'm not sure why they don't, and maybe they do. They wouldn't have to curate each search, but instead look at the most common searches, create a machine list of the top 100 sites, and let curators decide the top 10 or so.
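The crowd-sourcing idea above is essentially what PageRank formalized: links act as votes, and votes from well-linked pages count for more. Here is a minimal power-iteration sketch; the graph, damping factor, and iteration count are invented for illustration and are not Google's actual values:

```python
# Toy PageRank power iteration: each page splits its rank across its
# outlinks, and a damping factor models random jumps. Values are
# illustrative, not Google's.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to a list of pages it links to."""
    pages = set(links)
    for targets in links.values():
        pages.update(targets)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, targets in links.items():
            if targets:  # distribute this page's rank over its outlinks
                share = rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += damping * share
        rank = new_rank
    return rank

graph = {
    "home": ["about", "products"],
    "about": ["home"],
    "products": ["home", "about"],
}
ranks = pagerank(graph)
# "home" scores highest: every other page links to it.
```

With this toy graph, "home" ends up ranked highest because it receives the most incoming votes, which is the crowd-sourcing effect in miniature.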

Dymero
msg:4647137 - 2:38 pm on Feb 20, 2014 (gmt 0)

If only they could identify who is giving and receiving links, that'd be extremely powerful for them.


Other than identifying certain patterns that might indicate link buying or something like advertorials, or very heavy incidence of keyword-rich anchor text, I don't see how they can possibly do this.

If a link builder can place enough natural-looking links within content, carefully maintains a good non-rich-to-rich anchor ratio (which, if we're honest with ourselves, should probably skew more to the non-rich side), and does it at a gradual pace, it'd be virtually impossible to detect, even if money were being exchanged.

Where Google does seem to falter, though it could improve, is in that pattern recognition. I've seen plenty of examples of obvious link buying that haven't been picked up by Google.
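The non-rich-to-rich anchor ratio mentioned above can be sketched as a crude heuristic. The 30% threshold and the keyword list below are invented for the example; a real system would weigh many more signals than anchor text alone:

```python
# Crude anchor-text heuristic: flag a backlink profile whose anchors
# skew too heavily toward "money" keywords. The 30% threshold and the
# keyword list are invented examples.
def looks_unnatural(anchors, money_keywords, max_rich_ratio=0.3):
    rich = sum(1 for anchor in anchors
               if any(kw in anchor.lower() for kw in money_keywords))
    return rich / len(anchors) > max_rich_ratio

profile = ["click here", "buy cheap widgets", "example.com",
           "cheap widgets online", "this article", "buy widgets now"]
flagged = looks_unnatural(profile, {"cheap widgets", "buy widgets"})
# flagged is True here: 3 of the 6 anchors are keyword-rich.
```

A profile dominated by "click here" and bare URLs would pass, which matches the intuition that natural anchors mostly aren't keyword-rich.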

scooterdude
msg:4647145 - 3:31 pm on Feb 20, 2014 (gmt 0)

Do you mean they haven't done this already :)


Backlinks are going to be the foundation of any search engine, because they are a form of crowd sourcing. The only thing that could be better is if Google hired expert curators on various subjects. I'm not sure why they don't, and maybe they do. They wouldn't have to curate each search, but instead look at the most common searches, create a machine list of the top 100 sites, and let curators decide the top 10 or so.

Ralph_Slate
msg:4647149 - 3:42 pm on Feb 20, 2014 (gmt 0)

Do you mean they haven't done this already :)


I don't think they have, because I still see a lot of garbage when I search for relatively common things. Maybe they're hiring inexperienced bodies to use their judgment, but it seems like with their billions they could hire, say, 500 people - an expert or two in each of 500 different areas - and those experts would classify and evaluate common search phrases, examine Google's algorithm results, and look at the obviously crappy pages.

The key is that the people would have to be experts, not average people who aren't familiar with a niche. That may be Google's problem - they may be hiring generalists, who look at how pretty a site is, and ignore how informative or usable it is.

EditorialGuy
msg:4647155 - 4:05 pm on Feb 20, 2014 (gmt 0)

The key is that the people would have to be experts, not average people who aren't familiar with a niche.


Trouble is, experts look at a resource (a Web site, book, or whatever) differently than the typical user does, and what they value may not be what the typical user values.

I see this a lot in travel guidebooks:

- Guides written by and for non-locals tend to focus on basics that the typical first-time visitor cares about (such as airport transportation, major tourist attractions, and well-known restaurants).

- In contrast, guides written by and for residents often give short shrift to the tourist basics and emphasize a different set of things (such as off-the-beaten-path attractions, restaurants for the cognoscenti, obscure museums, parks, etc.)

Another example would be recipe books or sites. For the first-time bread baker, processes are the no. 1 priority. For the expert bread baker, unusual techniques or recipes are of greater interest.

In each of these examples, an expert reviewer needs to suppress his or her own biases and focus on the needs of the intended audience. That's often easier said than done. The task is even more difficult when "guidebook" or "recipe book" becomes "Web site" and the audience includes everyone who types "widgetville airport transportation" or "whole wheat bread" into a search box.

Don't get me wrong: Expert knowledge is useful in determining if the information on a site is accurate. But that's a different task than determining whether site A, B, or C is a good result for a specific audience.

Awarn
msg:4647168 - 4:28 pm on Feb 20, 2014 (gmt 0)

They could eliminate a lot of sites VERY fast. I tried a removal service and just tested a few sites at first. It was absolutely amazing to see the high percentage of non-US sites. The other common denominator was the same email address being used. All Google has to do is look at the contact info on a site, and if they see abuse@... (major US registrar), eliminate the site. Any directory that charges for removals? Eliminate it. It is an absolute joke. Google's spam team looks like the Three Stooges when you see the data.

jmccormac
msg:4647202 - 5:58 pm on Feb 20, 2014 (gmt 0)

Wonder if this means that the much vaunted 'Knowledge Graph' and AI stuff is a bit of a dead end and there will be more concentration on link analysis?

Regards...jmcc

EditorialGuy
msg:4647240 - 7:42 pm on Feb 20, 2014 (gmt 0)

Wonder if this means that the much vaunted 'Knowledge Graph' and AI stuff is a bit of a dead end and there will be more concentration on link analysis?


Couldn't they go hand in hand?

Also, I wonder if Google might not eventually follow the precedent set by Yandex, which is removing links as a ranking factor from commercial results in the Moscow region because of pervasive commercial link spam.

Links were conceived as "citations," and they didn't become "votes" until search engines came along and introduced concepts like link popularity and PageRank.

Does it really make sense to view, say, 5,000 links to debbies-dollar-store-dot-com or arnies-affiliate-site-dot-co-uk as "citations" of anything? For that matter, does it make sense to view 10,000 or 50,000 unsolicited, auto-generated links from Mrwhatis to my information site as "citations"? There are whole categories of links that could simply be thrown out without lowering the quality of Google's search results.

Martin Ice Web
msg:4647255 - 8:19 pm on Feb 20, 2014 (gmt 0)

without lowering the quality of Google's search results


Ha, yes, you are right, because it is already so low it couldn't be any lower, could it? Yes it can; we will see with the next quality update.

aristotle
msg:4647258 - 8:35 pm on Feb 20, 2014 (gmt 0)

There are whole categories of links that could simply be thrown out without lowering the quality of Google's search results.

Actually, throwing them out would probably raise the quality of Google's search results, especially if it means that they are no longer used as a basis for Penguin or other penalties.

bumpski
msg:4647259 - 8:41 pm on Feb 20, 2014 (gmt 0)

Backlinks are going to be the foundation of any search engine, because they are a form of crowd sourcing.
The "crowd" was completely eliminated when "rel=nofollow" was introduced.

Even before "nofollow", links came primarily from the web aristocracy (webmasters), not the common man. Google's concept of a link "democracy" was farcical.
It worked to some extent before business was done on the web.

EditorialGuy
msg:4647304 - 11:47 pm on Feb 20, 2014 (gmt 0)

Even before "nofollow", links came primarily from the web aristocracy (webmasters), not the common man. Google's concept of a link "democracy" was farcical.


I don't think Google's use of links was ever about the "common man" or "link democracy." On the other hand, it wasn't about a "Web aristocracy," either. It was simply about the idea that people writing Web documents (whoever they might be) were more likely to cite good Web pages than bad ones.

It worked to some extent before business was done on the web.


Yes, and it still does in many cases, but it tends to work better in an informational context than a commercial one. If I'm a pathologist writing a paper about the life of tattoo ink in buried bodies, I'll cite useful resources such as other published papers on the topic. Those links are a lot more likely to be genuine citations than links acquired by an SEO firm for a commerce client.

ColourOfSpring
msg:4647306 - 11:51 pm on Feb 20, 2014 (gmt 0)

For that matter, does it make sense to view 10,000 or 50,000 unsolicited, auto-generated links from Mrwhatis to my information site as "citations"?


I agree with you, but it's the MrWhatIs-type scraper sites that give small business sites penalties....it's ridiculous. I guess the small business guy just has to be "recognised" by the blogerati and get tons of buzz about his actually-useful-in-the-real-world-but-not-buzz-generating-products. And never mind that people are actually looking for his products and want to buy them because they're useful. Instead he has to make a zany website to be talked about and win citations.

Dymero
msg:4647358 - 6:32 am on Feb 21, 2014 (gmt 0)

Wonder if this means that the much vaunted 'Knowledge Graph' and AI stuff is a bit of a dead end and there will be more concentration on link analysis?


The knowledge graph and organic results are tackling Google's mission statement of "organizing the world's information" from different directions.

The knowledge graph collects facts and makes connections between them in order to give someone a quick answer. This supposedly will be extended in the future to actually coming up with answers to questions it doesn't already know (the elusive "Star Trek computer").

The organic results, meanwhile, lead people to answers and other content - like opinions, discussion, and services - that Google can't easily show on its own site or that doesn't fit the mission statement.

CaptainSalad2
msg:4647427 - 10:47 am on Feb 21, 2014 (gmt 0)

I guess the small business guy just has to be "recognised" by the blogerati and get tons of buzz about his actually-useful-in-the-real-world-but-not-buzz-generating-products. And never mind that people are actually looking for his products and want to buy them because they're useful. Instead he has to make a zany website to be talked about and win citations.


GREAT point, very true! Bloggers ugh!

bumpski
msg:4647435 - 11:40 am on Feb 21, 2014 (gmt 0)

I don't think Google's use of links was ever about the "common man" or "link democracy."
This Google article is a hoot! (Ten things we know to be true ..., 200 signals!) In fact, don't tell anyone, I made a copy of this page; Shhhhh (before it disappears...)
[google.com...]
We first wrote these “10 things” when Google was just a few years old. From time to time we revisit this list to see if it still holds true. We hope it does—and you can hold us to that.
Democracy on the web works.
Google search works because it relies on the millions of individuals posting links on websites ....
When were there ever millions of individuals posting links that resulted in actual citations from those individuals? I have many pages listed as references, sources, and citations, and in many cases the author creating the actual citation probably has no idea that the webmaster (corporation) of the site has invalidated that citation with a "rel=nofollow". If a "webmaster" allows links from individuals on his site, he should stand by all of them or selectively delete them.
"rel=nofollow" partially invalidates Google's own stated principles.

If Google were following its own standards, "rel=nofollow" should add up to a penalty; wouldn't that turn the world upside down.

ColourOfSpring
msg:4647446 - 12:13 pm on Feb 21, 2014 (gmt 0)

bumpski, good point. On the one hand, Google say there are 200 signals and links are but one of those. On the other, they say their rankings are utterly rubbish without that 1 signal.

aakk9999
msg:4647448 - 12:17 pm on Feb 21, 2014 (gmt 0)

On the other, they say their rankings are utterly rubbish without that 1 signal

Probably because links bring much more than 1/200 of the signal.

bumpski
msg:4647450 - 12:34 pm on Feb 21, 2014 (gmt 0)

Google posted this article around Sep 14, 2009, but the article itself implies the philosophy has been around far longer.
I was sort of joking about the 200 signals mentioned, because now they tout 500 signals, each of which must have a weighting factor.
R & D has probably given Matt 500 dials on his screen that he can twiddle any time he wants, in real time, should he choose.

roshaoar
msg:4647459 - 12:55 pm on Feb 21, 2014 (gmt 0)

My hunch would be that Google sees backlinks as a sort of "push" signal, i.e. pushing a URL towards a higher ranking on Google - a rough "citation", if you will. But rather than immediately ranking a page on page 1 and keeping it there forever, when something ranks, Google then looks at the new page-1 results for that query, takes things like overall clickthrough rates and user satisfaction into account, and microadjusts accordingly and continuously.

Don't forget that the target Google (probably) has for itself is to satisfy 100% of Google queries with someone clicking through to a link on page 1. They don't want you leaving Google to find the info elsewhere. They don't want you going to page 58 of Google; it represents inefficiency and a failure to find information. Google wants: "Google has succeeded in your mission to find information really quickly." And if one set of page-1 links gives a higher internal success rate than another set, regardless of the SEO-ness or prettiness of the links, their algo will optimise for the higher. This strategy makes sense because it means Google search satisfies the largest possible number of people.

So yeah, links matter a lot as a ranking signal, but they just get you to the starting line; after that it's user experience and clickthrough. Kind of as it should be, surely.

I think, though, that inevitably it'll mean Google leans more and more towards "buy product" results, as that's what most people on the street use Google for. That's a shame, because many people use search engines to research things as well, and the commercial results do clutter informational research. What a shame there's no version of Google that lets you remove all the commercial "selling you stuff" results.
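The push-then-microadjust idea could be sketched as a simple blend of a link-based score with observed clickthrough satisfaction for the current page-one set. The blending weight and all the numbers below are invented for illustration:

```python
# Toy "microadjustment": blend a link-based score with observed
# clickthrough satisfaction for the current page-one results.
# The weights and numbers are invented for illustration.
def rerank(results, link_score, ctr, link_weight=0.7):
    blended = {r: link_weight * link_score[r] + (1 - link_weight) * ctr[r]
               for r in results}
    return sorted(results, key=lambda r: blended[r], reverse=True)

results = ["a.example", "b.example", "c.example"]
link_score = {"a.example": 0.9, "b.example": 0.8, "c.example": 0.5}
ctr = {"a.example": 0.1, "b.example": 0.9, "c.example": 0.4}
order = rerank(results, link_score, ctr)
# b.example overtakes a.example: strong links got a.example to the
# starting line, but users clearly prefer b.example.
```

In this toy run, strong links alone aren't enough to hold the top spot once user behaviour disagrees, which is the "starting line" intuition above.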

goodroi
msg:4647467 - 1:36 pm on Feb 21, 2014 (gmt 0)

I also would not assume that links are just one signal to Google. I would guess that the many different aspects of links are each a signal.

- age of the link
- link anchor text
- dofollow vs. nofollow
- placement of the link on the page
- PageRank of the link
- etc.

So taking away links could be taking away 10, 50 or 100 signals that Google has incorporated into the algorithm. While link data might be flawed and abused, there are few alternatives that are scalable and better or less prone to abuse.

I guess it is not easy to come up with an algorithm that can handle billions of different queries and deliver fast results to billions of people across the world, with results relevant enough to keep them from abandoning your search engine for another, all while an army of spammers is trying to attack you and an even bigger army of business owners is demanding that they, and they alone, deserve the #1 ranking. I don't feel bad for Google, but let's not think that this is a simple thing that just anyone could resolve.
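The point that "links" may really be dozens of signals can be illustrated by scoring a single link from several features at once. The features, weights, and scaling below are all invented for the example and are not Google's:

```python
# Toy combination of per-link features into one score, echoing the idea
# that "links" may really be many separate signals. The features,
# weights, and scaling are invented for illustration.
def link_value(link, weights):
    score = 0.0
    # age, capped at one year's worth of credit
    score += weights["age"] * min(link["age_days"] / 365.0, 1.0)
    # relevant anchor text
    score += weights["anchor"] * (1.0 if link["relevant_anchor"] else 0.0)
    # dofollow links count, nofollow links don't
    score += weights["follow"] * (0.0 if link["nofollow"] else 1.0)
    # authority of the linking page
    score += weights["pagerank"] * link["source_pagerank"]
    return score

weights = {"age": 0.2, "anchor": 0.3, "follow": 0.2, "pagerank": 0.3}
link = {"age_days": 730, "relevant_anchor": True,
        "nofollow": False, "source_pagerank": 0.5}
value = link_value(link, weights)
```

Removing "links" from such a model means removing every one of these feature terms at once, which is why it could amount to dropping tens of signals rather than one.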

brotherhood of LAN
msg:4647569 - 7:12 pm on Feb 21, 2014 (gmt 0)

let's not think that this is a simple thing that just anyone could resolve


I think this is often overlooked.

jmccormac
msg:4648154 - 7:04 pm on Feb 22, 2014 (gmt 0)

Couldn't they go hand in hand?
I don't think so. This gets back to the Social Network of Links idea I was talking about last year. (I think the posts are in some threads here.) Natural links to a site follow a pattern, and linkspam looks very different from the natural link pattern that builds up around a website.

Also, I wonder if Google might not eventually follow the precedent set by Yandex, which is removing links as a ranking factor from commercial results in the Moscow region because of pervasive commercial link spam.
Google could solve a lot of the linkspam problems overnight but it would devastate the meatbot industry and all the linkspammer businesses.

There are whole categories of links that could simply be thrown out without lowering the quality of Google's search results.
Again it gets back to the Social Network of Links.

Regards...jmcc


WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved