

SEO after Hummingbird... why referrer data has become less relevant

     
10:05 am on Oct 3, 2013 (gmt 0)

Moderator This Forum from US 

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 11, 2000
posts:11309
votes: 163


It just hit me... after posting a couple of threads about John Mueller's recent Google Hangout, and also reading through various Hummingbird discussions, why, going forward, referrer data would probably not be helpful for serious SEO.

John Mueller's Hangout video is on YouTube here...

Webmaster Central 2013-09-27
running time: 1:08:38
https://www.youtube.com/watch?v=R5Jc2twXZlw [youtube.com]

I posted about it with regard to...

Google Authorship
[webmasterworld.com...]

and...

Google's View of Product Descriptions
[webmasterworld.com...]

Particularly in the latter, I got the sense, from John's comments following the mention of Hummingbird at about 23:50 into the video, that Hummingbird looks at the different aspects of content included not just on a product page, but on a site, and that it tries to return sites that satisfy the intent of a wide range of relevant and helpful queries.

If so, an SEO's concern is not so much about what keywords are included on a page, so much as what pages are included on a site... and, beyond that, how helpful the pages and the site are to a broad range of users.

For a while now I've been pushing clients to build sites that anticipate a range of user intent and provide content useful to various kinds of users in different places in the buying cycle and in their engagement with the product... and perhaps even with the product niche.

What Hummingbird is about, I'm coming to understand... or at least I'm guessing from John's comments... is simply that it's better at evaluating the range of useful content that a site might return, and it increasingly will be sending users to such sites.

So it's no longer just a single page and its title satisfying a query... It becomes a whole site satisfying a range of users. With that kind of scope, the individual referrers are both less easy to specify and less determined by the landing page itself. Actually, not so different from what some of us have been preaching.

So, it's up to the SEO to guide the development of what is a genuinely useful site... not just one that matches keywords. Over the long term, this is going to depart from 2 and 3 word phrases... and the long tail phrases that comprise the "query" will also become increasingly harder to define, as they will perhaps cease to be just words. Intent will very much be part of the picture (as, e.g., mobile search/local purchase vs online purchase). I can't imagine chasing keywords in such a situation.
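
To make the intent point concrete, here's a toy sketch of a rule-based split between informational and transactional queries... the marker words and buckets are invented for illustration, not anything Google has described:

```python
# Toy intent classifier. The marker words are invented for this sketch;
# real intent modelling is far richer than keyword lookup.
TRANSACTIONAL = {"buy", "price", "cheap", "order", "discount", "shipping"}
INFORMATIONAL = {"how", "what", "why", "guide", "review", "vs"}

def classify_intent(query: str) -> str:
    words = set(query.lower().split())
    if words & TRANSACTIONAL:
        return "transactional"  # likely ready to purchase
    if words & INFORMATIONAL:
        return "informational"  # researching, earlier in the buying cycle
    return "unknown"

for q in ("buy blue widgets", "how do widgets work", "widgets"):
    print(q, "->", classify_intent(q))
```

Even a crude bucketing like this makes the point: a site that only answers the transactional bucket leaves the rest of the buying cycle to competitors.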

I also can't imagine that it's going to be very hard to look at a page and describe what keywords the core page should rank for. The issues are going to be with the broader rankings and the site overall.

This is my guess, at any rate. I feel that overall, this will be an improvement in the web, which will become less a collection of content farms and more a collection of pages created with the user genuinely in mind.
1:10 pm on Oct 3, 2013 (gmt 0)

Senior Member

WebmasterWorld Senior Member beedeedubbleu is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Feb 3, 2004
posts: 6099
votes: 6


This is my guess, at any rate. I feel that overall, this will be an improvement in the web, which will become less a collection of content farms and more a collection of pages created with the user genuinely in mind.
But not necessarily in the right order!
2:04 pm on Oct 3, 2013 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Mar 7, 2003
posts: 1048
votes: 0


It should be very interesting to see if any new tools start to evolve.

Rank checking just got a big shot in the arm.
2:11 pm on Oct 3, 2013 (gmt 0)

Moderator This Forum from GB 

WebmasterWorld Administrator 5+ Year Member Top Contributors Of The Month

joined:Apr 30, 2008
posts:2507
votes: 138


Great post, Robert!

For a while now I've been pushing clients to build sites that anticipate a range of user intent and provide content useful to various kinds of users in different places in the buying cycle and in their engagement with the product...

Absolutely. And the emphasis is on the "useful content" (not just on "unique content").
2:12 pm on Oct 3, 2013 (gmt 0)

Preferred Member

5+ Year Member

joined:Nov 16, 2010
posts:533
votes: 0


Guys, 'tis a fine theory, but to what extent have they converted this into functioning code?

Conflating a human mind's intent with algo capabilities may be stretching...
2:48 pm on Oct 3, 2013 (gmt 0)

Senior Member

WebmasterWorld Senior Member beedeedubbleu is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Feb 3, 2004
posts: 6099
votes: 6


Personally I think Google have given up on offering the most appropriate results at the top of the SERPs in favour of shafting webmasters and SEOs. :(

For a while now I've been pushing clients to build sites that anticipate a range of user intent and provide content useful to various kinds of users in different places in the buying cycle and in their engagement with the product... and perhaps even with the product niche.
We also have to remember that not all sites have a product or a buying cycle. In actual fact those with no buying cycle and no products are often those that deserve to be at the top of the SERPs.
3:56 pm on Oct 3, 2013 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:June 14, 2010
posts:985
votes: 0


Particularly in the latter, I got the sense, from John's comments following the mention of Hummingbird at about 23:50 into the video, that Hummingbird looks at the different aspects of content included not just on a product page, but on a site, and that it tries to return sites that satisfy the intent of a wide range of relevant and helpful queries.

If so, an SEO's concern is not so much about what keywords are included on a page, so much as what pages are included on a site... and, beyond that, how helpful the pages and the site are to a broad range of users.

Huh?! Is this a new concept for "SEOs"?!

I have always developed like that since day one. I've always treated every site as an integrated whole, and if someone wanted to present something on their site that ventured too far away from the core topic, I would dissuade them.

The old Google keyword tool was valuable in that sense because at the bottom it used to provide "related terms". That was my barometer for gauging what sort of content should or should not be placed on a site.
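
As a rough illustration of that barometer, here's a minimal sketch; the related-terms list is hypothetical (the old tool's suggestions are gone), and the page texts would come from your own site:

```python
# Sketch of using "related terms" as a content barometer. The term list
# and page texts are invented stand-ins for this illustration.
related_terms = ["widget repair", "widget sizes", "widget installation"]

pages = {
    "/widgets": "our widgets come in three sizes and four colors",
    "/support": "step-by-step widget installation and repair guides",
}

def covers(text: str, term: str) -> bool:
    # Crude containment check: every word of the term appears in the page.
    return all(word in text for word in term.split())

for term in related_terms:
    hits = [url for url, text in pages.items() if covers(text, term)]
    print(term, "->", hits if hits else "gap: candidate for supporting content")
```

Terms that come back as gaps are candidates for the kind of "supporting content" described below.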

If SEOs have only been focusing on pages as stand-alone entities, I can now understand why so many have tanked.

I've always approached site development from the perspective of how many of those "related term" queries I can also answer throughout the site as a whole, thus giving each single page backup support. I've referred to it as "supporting content" in past posts here throughout WW without elaborating on what I meant by that. And nobody ever asked me what I meant, so I assumed everyone was on the same page and understood the concept.

Please, I need to hear from others here that you too have always approached it in this manner and that it's not a new concept!
4:17 pm on Oct 3, 2013 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:2678
votes: 94


I've always treated every site as an integrated whole

That's the way anyone would do it if their goal is to create a quality useful website. Surely it hasn't taken Google all these years to realize this.
4:38 pm on Oct 3, 2013 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:June 14, 2010
posts:985
votes: 0


Surely it hasn't taken Google all these years to realize this.

No, I don't think it has. The fact that they have always had those "related keywords" supplemental suggestions there implies they knew how important it was. Maybe they were hoping this new core can better process integrated wholes now. But so far, from what I'm seeing, it's not.

I've been trying out Google for a few days because of this hype, but I'm not seeing anything much different -- the same set of lousy results that can't recognize my intention of an informational search vs a commercial query. Garbage. I got creative, and with each new set of search results I just immediately skipped ahead to page 7 and started browsing from there onward. Pages there were more relevant.

For me google is still very much a waste of time but I have to jump through their hoops for the sake of my clients. If I were only doing this for myself google would have been long gone from my vocabulary. I have to grudgingly keep myself in the loop of happenings.
5:11 pm on Oct 3, 2013 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:June 28, 2013
posts:2367
votes: 208


Matt Cutts did say, a while back, that Google is shifting its focus from "strings" to "things." That might be a good strategy for SEOs and site owners, too.
5:32 pm on Oct 3, 2013 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:June 14, 2010
posts:985
votes: 0


If they are shifting from strings, which in my thought patterns imply depth of quality, to things, which imply single items to simply satisfy consumerism, then it points to something we've all seen coming for a while now. They aren't becoming a knowledge engine (as much as they might like for us to believe, to stoke their egos for them); they are becoming a pay-for-inclusion portal.

No amount of SEO is going to overcome it because they will direct the flow of traffic to their own interests. WSO, however, still has a chance to maintain some good placings in SERPs.
6:38 pm on Oct 3, 2013 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Mar 7, 2003
posts: 1048
votes: 0


From strings to things! Awesome!

No big surprises there.

So that means they are moving from bits to atoms?

This will be interesting indeed.
7:06 pm on Oct 3, 2013 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 10+ Year Member

joined:Aug 27, 2003
posts:1597
votes: 0


How would a website owner know if their site "satisf[ies] the intent of a wide range of relevant and helpful queries"?

How will you know what the user intent is? Even more so now that keyword data is almost gone.
7:21 pm on Oct 3, 2013 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:June 28, 2013
posts:2367
votes: 208


If they are shifting from strings, which in my thought patterns imply depth of quality, to things, which imply single items to simply satisfy consumerism


I doubt if Matt Cutts was thinking of e-commerce when he made that statement.

I suspect it was simply another way of saying "topics, not keyword1 keyword2."
8:44 pm on Oct 3, 2013 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Dec 19, 2004
posts:1939
votes: 0


Smart read, thanks Robert. It all makes sense.

In lots of ways it is nice to see, as it could finally separate low-quality providers of SEO services from the leaders and dispel so much of the myth and distrust in SEO providers in general.

A nice complementary side effect is that the things SEOs need to ask of their clients are simply optimized branding events, so this becomes much easier for brands to understand and much less smoke and mirrors.

Explaining SEO as a function of optimizing brand is an easy sell and a value-driven one.
10:28 pm on Oct 3, 2013 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member tangor is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 29, 2005
posts:6130
votes: 277


Okay... I grok all that's been said, but query to the OP... where's the "referrer data" part of the subject in all this?

Hummingbird is more of a mach schnell in SERP returns (by ignoring and/or returning results like previous ones, etc., i.e. a common denominator taken from crowdsourcing).

HB also looks at the singular, i.e. the "widget", and cares not a fig if there are 32,000 colors of the same offered by a site... that is, the site gets one page to hit instead of 32,000 pages.

What the sellers of widgets need to do from here on out is have a really outstanding widget page leading to a checkout (the site's time, noindex,nofollow,hecknotevenseenbybots) eCart.
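
For what it's worth, here's a minimal sketch of keeping such a checkout flow out of the index, assuming a Python/Flask app (the framework, route, and page body are stand-ins of my own; a robots meta tag in the page head achieves the same thing):

```python
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/checkout")
def checkout():
    resp = make_response("<h1>Your cart</h1>")
    # Crawlers that honour X-Robots-Tag treat this like a robots meta tag:
    # don't index the page, don't follow its links.
    resp.headers["X-Robots-Tag"] = "noindex, nofollow"
    return resp

if __name__ == "__main__":
    app.run()
```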

Every widget page counts... and all the variations of same won't be seen.

G (and believe it or not B and Y, too) will say this is not dumbing down the web. Look! We have better returns!

Reality is the algo is getting strained; there are just SO MANY sites out there, with more coming online every MINUTE, that G is terrified to miss anything. They must keep slogging away (including referrers). Bottom line is a SERP needs to be returned, and I'm seeing more simplicity. Instead of the widget red, widget blue, widget green, etc. of the old days, the new return is "widget", if returned at all.

I think the newbie sandbox of a few days after robot discovery will continue, since finding "new" is the bread and butter of keeping sellers and publishers alive to fuel the insatiable hunger of the robots (of G, B, and Y, etc.)... but if that magic moment of top view doesn't take off... hello -400!
10:34 pm on Oct 3, 2013 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Apr 29, 2005
posts:1868
votes: 41


Spot on, Robert. It's just amazing that G has taken this long to get there - well, let's hope they have got there.

Why has it taken so long, with all the brains and computing power they have? 15 years to get to this point is a staggering period of time in this area of business.
12:27 pm on Oct 4, 2013 (gmt 0)

Junior Member

10+ Year Member

joined:Mar 26, 2005
posts:75
votes: 0


Search for "I forgot that WebmasterWorld does not allow specific keywords", and the #2 result on the first page is just an empty page!

[edited by: goodroi at 12:43 pm (utc) on Oct 4, 2013]
[edit reason] Please no specifics as per forum charter [/edit]

3:27 pm on Oct 4, 2013 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Sept 16, 2009
posts:991
votes: 48


And yet Google's core product offering continues to be specific keywords matched to the specific query, rank checking to the decimal place and full (well, sort of, in principle) keyword disclosure.
11:42 pm on Oct 5, 2013 (gmt 0)

Full Member

10+ Year Member

joined:June 4, 2005
posts: 240
votes: 30


All these theories require immense processing power to implement over the whole web. Such processing power simply does not exist today.

Hummingbird may be some new form of something (I can see no proof so far, but will give it the benefit of the doubt), but it certainly is not what Google spin tries to pass it off as.

Much ado about almost nothing, methinks . . . . .

6:00 am on Oct 6, 2013 (gmt 0)

Moderator

WebmasterWorld Administrator skibum is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Sept 20, 2000
posts:4469
votes: 1


It is arrogant of Google to think they can understand what someone is thinking. When someone enters a keyword, they are, most of the time, telling the search engine what they want in a very literal sense.

Google seems to want to make it much more complicated than that.

It's like someone going into a store and telling the clerk they would like to see the TV selection. The clerk thinks he can read the guy's mind because the customer has a little sweat dripping from his forehead, so he tells him the ice cream truck is around the corner and walks away.

Maybe one day it will be possible to do what Google seems to be trying to do but I'm afraid today is not that day.
2:58 pm on Oct 7, 2013 (gmt 0)

Junior Member

5+ Year Member

joined:June 10, 2010
posts: 122
votes: 0


>All these theories require immense processing power to implement over the whole web. Such processing power simply does not exist today.

They're not using this on the whole web. They're using it in first-world countries first, and slowly rolling it out in developing countries.

Also, the horsepower is getting cheaper and cheaper. Servers with ARM chips can do a lot of work. They are cheap, low-power, and can be clustered.
3:00 pm on Oct 7, 2013 (gmt 0)

Junior Member

5+ Year Member

joined:June 10, 2010
posts: 122
votes: 0


>Google seems to want to make it much more complicated than that.

They do seem to be on a mission of complexity. I have seen this before in certain companies, where they get this idea in their heads that they can accomplish amazing things. Sometimes it works out (Amazon), sometimes it doesn't.
3:07 pm on Oct 7, 2013 (gmt 0)

Junior Member

5+ Year Member

joined:June 10, 2010
posts: 122
votes: 0


>going forward, referrer data would probably not be helpful for serious SEO.

I disagree. Keywords give us the intent of the searcher. They also go to user experience. How can we possibly find long-tail keywords without this data?
How can we build pages that fit the long-tail keywords without this data?
Long tail is about user experience. Google has extremely sophisticated methods to measure user experience; however, they're not sharing those with us.
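
One workaround, sketched below under the assumption that you keep logs of your own site-search box (a data source Google can't switch off), is to mine those queries for long-tail phrasing:

```python
from collections import Counter

# Invented stand-in for an export of your own site-search logs,
# one query per line.
queries = [
    "blue widgets",
    "how to mount a blue widget outdoors",
    "blue widget hinge replacement part",
    "widgets",
]

# Call anything of four or more words "long tail"; the cutoff is arbitrary.
long_tail = Counter(q.lower() for q in queries if len(q.split()) >= 4)
for query, count in long_tail.most_common(20):
    print(count, query)
```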

I can see how people can become frustrated.
Even if you buy everything Google says, and you want to build the greatest website you can, Google is making it more difficult by taking away these tools.
2:23 pm on Oct 8, 2013 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:July 18, 2002
posts:2283
votes: 3


How can we build pages that fit the long-tail keywords without this data?

The point is that Google no longer wants you to "build" a page for a long-tail keyword. It wants you to build a site that answers the intent of a wide range of related potential keyphrases. No site can cover every potential keyphrase on a page-by-page basis without serious duplication of information, which is detrimental to the visitor experience.

Google came to the realization a long time ago (I think) that their algo, as it was, could not efficiently deal with every potential keyphrase. I think they had already turned their algo toward dealing with this; Hummingbird is just a way to do it more efficiently from a computing standpoint, which is why it has not appeared to affect much.
2:33 pm on Oct 8, 2013 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:2678
votes: 94


Long-tail referrals from Google have always tended to be poorly targeted in comparison to shorter search terms. To some extent, this is probably unavoidable. It might be interesting to see if bounce rates on long-tail referrals improve as a result of Hummingbird. Except that it's hard to separate long-tail traffic from short-tail traffic if referral data isn't being provided.
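
A sketch of that comparison, runnable only against historical referrer data from before the switch-off; the sample records and the four-word cutoff are invented:

```python
from collections import defaultdict

# Invented sample of (query, bounced) visits from old logs.
visits = [
    ("widgets", True),
    ("widget", True),
    ("blue widget installation guide pdf", False),
    ("how to fix a squeaky widget hinge", False),
]

buckets = defaultdict(lambda: [0, 0])  # bucket -> [bounces, total visits]
for query, bounced in visits:
    bucket = "long-tail" if len(query.split()) >= 4 else "short-tail"
    buckets[bucket][0] += bounced
    buckets[bucket][1] += 1

for bucket, (bounces, total) in buckets.items():
    print(f"{bucket}: {bounces / total:.0%} bounce rate over {total} visits")
```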
2:49 pm on Oct 8, 2013 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:Jan 1, 2011
posts:1358
votes: 18


Long-tail referrals from Google have always tended to be poorly targeted in comparison to shorter search terms.


You know, it's always seemed just the opposite on my sites, so this may depend on other factors.
3:03 pm on Oct 8, 2013 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:2678
votes: 94


You know, it's always seemed just the opposite on my sites, so this may depend on other factors.

Do you mean in comparison to terms that you actually tried to target? I was thinking of "accidental" traffic that I see in my logs (or at least used to see before Google started withholding referral data).
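
For the record, this is roughly what pulling those keywords out of the logs looked like: a sketch that extracts the q= parameter from Google referrers in an Apache combined-format log (the log format is an assumption). Encrypted searches carry no query, which is exactly the data that disappeared:

```python
import re
from urllib.parse import urlparse, parse_qs

# Apache combined format ends with "referrer" "user-agent".
REFERRER_RE = re.compile(r'"([^"]*)" "[^"]*"$')

def query_from_line(line: str):
    """Return the Google search terms behind a hit, if the referrer has them."""
    m = REFERRER_RE.search(line)
    if not m:
        return None
    ref = urlparse(m.group(1))
    if "google." not in ref.netloc:
        return None  # not a Google referral
    q = parse_qs(ref.query).get("q")
    return q[0] if q else "(not provided)"  # encrypted search drops the query

sample = ('1.2.3.4 - - [08/Oct/2013:10:00:00 +0000] "GET /widgets HTTP/1.1" '
          '200 5120 "http://www.google.com/search?q=blue+widgets" "Mozilla/5.0"')
print(query_from_line(sample))  # -> blue widgets
```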
3:19 pm on Oct 8, 2013 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:July 18, 2002
posts:2283
votes: 3


Do you mean in comparison to terms that you actually tried to target?

Not everybody tries to target keyphrases anymore. I have been targeting, I guess you could call it, "key concepts" for years now and have found that long-tail traffic has been pretty accurate. Focusing on targeting keyphrases may have caused issues with long-tail traffic because Google was confused as to the key concept.

The fact that G has removed the ability to see keyphrase referrals, followed by this update, indicates that they would like SEOs to move away from the idea of keyphrases and start thinking in terms of key concepts. Actually, they are forcing SEOs into it now.
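
As one naive reading of "key concepts", the sketch below strips generic modifier words from each phrase and groups phrases around their most frequent remaining token; the word lists and sample phrases are invented for illustration:

```python
# Crude concept grouping: drop generic modifiers, then cluster phrases
# by their most common remaining token.
from collections import Counter, defaultdict

GENERIC = {"buy", "best", "cheap", "how", "to", "for", "clean", "sale", "online"}

phrases = [
    "buy widgets online",
    "best widgets for beginners",
    "how to clean widgets",
    "cheap garden gnomes",
    "garden gnomes for sale",
]

freq = Counter(w for p in phrases for w in p.lower().split() if w not in GENERIC)

concepts = defaultdict(list)
for p in phrases:
    tokens = [w for w in p.lower().split() if w not in GENERIC]
    head = max(tokens, key=lambda w: freq[w])  # most frequent token = "concept"
    concepts[head].append(p)

for head, group in concepts.items():
    print(head, "->", group)
```

Each resulting group is roughly one "key concept" a page (or section) could answer, rather than one page per phrase.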
4:28 pm on Oct 8, 2013 (gmt 0)

Junior Member

5+ Year Member

joined:June 10, 2010
posts:122
votes: 0


For people who have not read Chris Anderson's book, The Long Tail, I highly recommend it. The success of Amazon and Netflix is because of the long tail. But now it's almost impossible to see long-tail trends, LT desires, LT user intent.
I have found this extremely valuable in creating content that users want, rather than content I think they will like.