This 34 message thread spans 2 pages.
|SEO after Hummingbird... why referrer data has become less relevant|
It just hit me... after posting a couple of threads about John Mueller's recent Google Hangout, and also reading through various Hummingbird discussions, why, going forward, referrer data would probably not be helpful for serious SEO.
John Mueller's Hangout video is on YouTube here...
Webmaster Central 2013-09-27
I posted about it with regard to...
Google's View of Product Descriptions
Particularly in the latter, I got the sense, from John's comments following the mention of Hummingbird at about 23:50 into the video, that Hummingbird looks at the different aspects of content included not just on a product page, but on a site, and it tries to return sites that satisfy the intent of a wide range of relevant and helpful queries.
If so, an SEO's concern is not so much about what keywords are included on a page, so much as what pages are included on a site... and, beyond that, how helpful the pages and the site are to a broad range of users.
For a while now I've been pushing clients to build sites that anticipate a range of user intent and provide content useful to various kinds of users in different places in the buying cycle and in their engagement with the product... and perhaps even with the product niche.
What Hummingbird is about, I'm coming to understand... or at least I'm guessing from John's comments... is simply that it's better at evaluating the range of useful content that a site might return, and it increasingly will be sending users to such sites.
So it's no longer just a single page and its title satisfying a query... It becomes a whole site satisfying a range of users. With that kind of scope, the individual referrers are both less easy to specify and less determined by the landing page itself. Actually, not so different from what some of us have been preaching.
So, it's up to the SEO to guide the development of a genuinely useful site... not just one that matches keywords. Over the long term, this is going to depart from 2 and 3 word phrases... and the long tail phrases that comprise the "query" will also become increasingly harder to define, as they will perhaps cease to be just words. Intent will very much be part of the picture (e.g., mobile search/local purchase vs. online purchase). I can't imagine chasing keywords in such a situation.
I also can't imagine that it's going to be very hard to look at a page and describe what keywords the core page should rank for. The issues are going to be with the broader rankings and the site overall.
This is my guess, at any rate. I feel that overall, this will be an improvement in the web, which will become less a collection of content farms and more a collection of pages created with the user genuinely in mind.
|But now it's almost impossible to see long tail trends, LT desires, LT user intent. |
There are other ways to do that than looking at your logs. I personally use 3 different methods that have nothing to do with log walking.
For example, talking to your visitors will do a world of wonders for finding LT concepts, mostly because they will be concepts/keyphrases you never would have found through your own logs. And social media provides an excellent venue for doing this.
Aristotle, what hannamyluv said in response to you is pretty much exactly my answer too. I target not "keyphrases I think Google will send me traffic for" but rather keyphrases representing stuff I believe people are talking about in social media, amongst friends, in magazines, on the news, whatever. I assume search will follow, and generally that's worked for me.
But I find Google is more likely to send me mismatched traffic on a short phrase than a long one, and that's what I was referring to. So maybe it depends whether you're targeting keyphrases for which there's already established Google traffic, or not.
|Not everybody tries to target keyphrases anymore. I have been targeting, I guess you could call it, "key concepts" for years now and have found that long tail traffic has been pretty accurate. Focusing on targeting keyphrases may have caused issues with long tail traffic because Google was confused as to the key concept. |
Somebody else gets it!
I've been targeting key concepts (or themes) since 2003. I started doing this after observing that related pages strongly supported each other. So I set about crafting groups of pages for each concept and have never looked back.
No reason to run keyword ranking reports. No reason to look at keyword referrer data. All I have to do is look at my traffic and I know what's performing or not. (And I'm actually getting a bit lazier in my semi-retirement, some months just looking at where my affiliate sales or AdSense clicks came from.)
Of course, you have to know your sites intimately to do this; if you're the type of person who doesn't know how many pages are on a site, or wants to create content fast, or wants to do anything in this business easily, it's not for you.
For client work it's a bit more difficult to get the buy-in, as they've all heard about keywords and buying links and white-on-white text, blah, blah. But the ones that get it, really do get it. It usually starts with a client who can look long term (the one client I kept after a medical event a couple of years ago was my first, from '97), and who can think a bit unconventionally as long as the strategy shows results (a second client I picked up about 8 months ago because I was bored and he was tired of his old SEO company).
And you can extend the idea of related concepts: I have one site where I use a kind of "super concept" and several related sub concepts. Works like a charm.
Oh, and my sites wouldn't survive without targeted long tail traffic.
For related concepts - products actually - go to your local bookstore and pick up a couple of magazines on your topic and check out the advertising. In many cases you'll find ads for other topics chasing the same demo.
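To make the "key concepts" idea a bit more concrete for anyone who wants to sort their own query data this way: here's a minimal sketch that buckets long tail phrases under a shared content word. Everything in it is illustrative (the stopword list, the tokenizer, and the "most frequent token = concept" rule are my own crude assumptions, not anything Google or any tool actually does):

```python
from collections import defaultdict

# Illustrative stopword list -- extend for real data.
STOPWORDS = {"how", "to", "for", "the", "a", "an", "best", "what", "is"}

def group_by_concept(queries):
    """Bucket long-tail query phrases under their most frequent
    non-stopword token, as a crude stand-in for a 'key concept'."""
    # Count how often each content word appears across all queries.
    counts = defaultdict(int)
    tokenized = []
    for q in queries:
        tokens = [t for t in q.lower().split() if t not in STOPWORDS]
        tokenized.append((q, tokens))
        for t in set(tokens):
            counts[t] += 1
    # Assign each query to its highest-frequency content word.
    concepts = defaultdict(list)
    for q, tokens in tokenized:
        if tokens:
            concepts[max(tokens, key=lambda t: counts[t])].append(q)
    return dict(concepts)

queries = [
    "best running shoes for flat feet",
    "how to clean running shoes",
    "trail running tips",
]
# All three phrases share the concept word "running", so they
# land in one bucket -- three queries, one concept.
print(group_by_concept(queries))
```

The point isn't the code; it's that once queries are grouped by concept rather than exact phrase, you evaluate performance per concept (as the poster above does with traffic and affiliate sales), not per keyphrase.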
From "strings to things" is part of the natural evolution that began with Google's acquisition of Metaweb and Freebase (now Knowledge Graph). Understanding "things" or entities is much more complex, obviously, but it makes sense for the direction that Google sees search moving in.
If you think about it, it's much more efficient for Google to be able to isolate a central entity, and then offer attributes about that entity as navigation options.
This is especially true given Google's heavy focus on voice search. Recognizing specific keywords within a query can be *extremely difficult*, especially on mobile devices and their noisy environments. So it's easier for them to abstract a query to the central entity, and then offer navigation options based on what they know about that entity.