| This 65 message thread spans 3 pages |
|Google & Traffic Shaping - a hidden method to the quality madness?|
| 9:35 pm on Oct 27, 2010 (gmt 0)|
On the Monthly Thread [webmasterworld.com], I've posted about my experience of what appeared to me to be an extended period of testing:
|Using a multivariate dataset, across a range of different keyphrases, user intents and user types, Google exposed our site in marginal but significant ways (putting us up one place, dropping Universal search, above or below shopping results, etc). They did this with (at least) four separate sets. |
Over the course of 6 weeks, we experienced a slow churn of referrals, with four discrete datasets:
|BIG uplift in traffic since 1st SEPT (20% above trend) with a corresponding drop in conversion rate, so sales were broadly static (on trend). Referrals shifted at precisely the same time. No visible change in ranking. |
A referral shift on 16th Sept, another on 6th October - both traffic- and conversion-neutral (relative to 1st Sept).
Then the biggy. 12th October, huge referral shift. Traffic-neutral, but conversions back at pre-Sept level. In other words, we are now 20% up on sales. The referrals are NOT the same as (or even particularly similar to) the pre-September set.
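For anyone trying to spot this signature in their own logs, here's a minimal sketch of the arithmetic (the figures are invented for illustration, not the actual numbers from this post): a 20% traffic uplift paired with a matching conversion-rate drop leaves absolute sales roughly flat, which is exactly the pattern described above.

```python
# Hypothetical daily figures illustrating the "traffic up, conversion
# rate down, sales flat" signature described in the post.

def conversion_rate(sales, visits):
    """Sales divided by visits, as a fraction."""
    return sales / visits

baseline_visits, baseline_sales = 1000, 50   # pre-Sept trend (invented)
uplift_visits = baseline_visits * 1.20       # +20% traffic
uplift_sales = 50                            # sales broadly static on trend

base_cr = conversion_rate(baseline_sales, baseline_visits)   # 5.00%
uplift_cr = conversion_rate(uplift_sales, uplift_visits)     # ~4.17%

# The tell-tale signature: traffic up, conversion rate down, sales flat.
print(f"baseline CR: {base_cr:.2%}, post-uplift CR: {uplift_cr:.2%}")
```

Plotting these two rates against your referral-source breakdown over time is one way to line up the "referral shift" dates the poster describes.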
Does anyone else have any experience of what appears to be purposeful traffic shaping- with a definitive end result?
In the past, I've shied away from any theory that requires a "my site is special" mindset, but I am convinced this was outside the normal algo development cycle. My personal point of view is that Google is aggressively profiling users and sites, and trying to match the two within a specific context. Any takers?
I don't want this to become a "Google has no purpose, all my traffic is going to SPAM sites" free-for-all. Please post with qualitative data, or some meta analysis.
| 5:11 am on Oct 29, 2010 (gmt 0)|
wow that's a good observation guys. I am sure it will help all of us.
| 10:51 am on Oct 29, 2010 (gmt 0)|
|how "informational" and "transactional" might be made more granular |
Well, let's say you are looking to buy (transactional).
Do you want your hand held through the sales process? So "I need a car" is slightly different to "Buy [make-model]", similarly
"I need a mobile phone (cell phone) contract" is different to "iPhone on <network> contract"
On one, you're LOOKING TO BUY, but you need a VALUE ADD sale. On the other, you have MADE YOUR DECISION, and are PRICE SENSITIVE. Regardless of traditional ranking factors, you need a DIFFERENT TYPE OF SITE. If you need hand-holding, you are likely to feel a page full of prices but little info is rubbish. A price hunter would hate a page with a load of text and a hard-to-spot price / availability.
Similarly, you might serve totally different pages for the final term of each chain below:
"Remote power" -> "Ethernet power" -> "POE" -> "802.3af"
"802.3" -> "802.3at" -> "802.3af"
Or, how about a subtle distinction:
"ipod" -> "mac" -> "apple"
"google" -> "microsoft" -> "apple"
Now, zooming out, what if you are primarily interested in companies, and not gadgets? "apple" should give you the second version of the above example. Gadget lovers would probably prefer the first. Persistent browsers, or impulse buyers, would want the texty ecom, whereas those who just buy what they know they want would want the price-orientated sites every time.
I could give more examples, but this was simple in my head and has filled a lot of space on paper.
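The split described above - generic phrasing signalling a "value add" buyer versus a concrete model number signalling a price-sensitive decided buyer - could be sketched as a toy rule-based classifier. To be clear, these rules are invented for illustration; nobody outside Google knows the actual taxonomy:

```python
# A toy sketch (invented rules, NOT Google's actual taxonomy) of
# splitting the same transactional query space by specificity.

def classify_intent(query):
    """Very rough bucket assignment based on query phrasing."""
    q = query.lower()
    # Generic, open-ended phrasing: the searcher wants hand-holding.
    if any(tok in q for tok in ("i need", "which", "best", "help")):
        return "transactional/value-add"
    # Model numbers or explicit "buy": decision made, price-sensitive.
    if any(ch.isdigit() for ch in q) or "buy" in q:
        return "transactional/price-sensitive"
    return "informational"

assert classify_intent("I need a mobile phone contract") == "transactional/value-add"
assert classify_intent("buy iphone 4 on contract") == "transactional/price-sensitive"
assert classify_intent("how poe works") == "informational"
```

The point of the sketch is only that the same surface topic ("phone contract") can map to very different site types once you look at phrasing - which is the granularity being discussed here.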
Your "escape vector" is the type I would strongly advocate - with the addition of link-building from HIGH QUALITY sources in your EXPANDED FOCUS area to DEEP PAGES (but not bottom-tier pages). Really hammer home the message. And don't panic when you see medium-term flux - that's the expected interim result.
| 1:27 pm on Oct 29, 2010 (gmt 0)|
|groups of users with certain patterns in their search histories (en masse), query types that fall into certain patterns and websites of certain patterns |
Yeah, I can see that, whatever the taxonomy: Shopper, Traveler, B2B, Widget Enthusiast, Political Gadfly, SEO (here he is again, the same danged keyword searches day after day after day).
| 1:59 pm on Oct 29, 2010 (gmt 0)|
|from freejung's post... 2: Change your user intent category. Convince Google that you're not really an informational site about fuzzy widgets, you're actually a fuzzy widget e-comm site |
This excellent thread has helped me crystallize in my own mind the crucial importance of explicitly identifying for Google exactly the kind of site I have. I know that seems obvious, but I think that I (and perhaps others as well) have not been clear enough. For example, if I have a site whose true intention is to sell widgets, then I want to make it abundantly clear that I am primarily an online store, because it would be easy to be identified as mostly informational, and then Google (with their intention to give people what they think they want) would be sending visitors that want to read about widgets, but not necessarily buy one. I appreciate the traffic, but that would mostly do me no good.
That is not to say that my sites will not remain informational ~ they certainly will because I understand the importance of building "authority" on the subject ~ but I will go through much of my top-of-page wording to clarify the true core intent, so it is easier for Google to get it right ~ to have people purchase something. This effort may or may not do any good, but it seems logical to me that this is how the "new Google" operates as they attempt to read people's intentions, not just their queries.
| 2:53 pm on Oct 29, 2010 (gmt 0)|
|Your "escape vector" is the type I would strongly advocate- with the addition of link-building |
Thanks, Shaddows, that's very encouraging.
I agree about the link-building. I sort of left that out on purpose because of the discussion in the other thread about whether to do link building at all. Certainly if you have the ability to get links from high-quality sources, doing so is almost always a good idea regardless.
Good tip about linking deep but not to the bottom level. That's what I encourage users of my content to do: link to subcategory pages, not individual widget pages or the homepage.
|But I'm kind of stumped as to how "informational" and "transactional" might be made more granular |
There are always lots of ways to slice a cat - that's the great thing about analysis. For example, if you're looking for information, do you want it for entertainment reasons, or just idle curiosity, or are you doing deep academic research, or are you researching a purchase? You would want very different sites in each case -- for entertainment you might want Youtube and lolcats type of stuff, for idle curiosity Wiki might suffice, for academic research you would want .edu sites and for purchase research you would want consumer reports.
In my niche, which I guess would be considered informational, I've already identified several distinct user intents as far as what they want to do with my content: some want to use it on a website or blog, some want to use it for educational purposes (homework), some just want it for personal use on their own computer, some want to use it for commercial print publications, and some just want to look at it and then move on. And that's _after_ having eliminated content geared toward users I'm really not in a position to satisfy, like those who want to buy a high-quality printed copy of the content (I'm thinking I might set that up separately and put it on another domain).
The question is, how is Google slicing the cat? I've been able to get clues about that by looking at the results for subtle variations of the main keywords -- sometimes they return pretty much the same results, sometimes they return a completely different set of sites. I've been able to eliminate one category of synonymous keywords just based on those sites clearly being geared toward a different user intent than my primary focus, which is on people who want to use the content online (after all, those are the ones who link).
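The "subtle keyword variation" probe described above can be made repeatable: collect the top results for each variant and measure pairwise overlap. High overlap suggests Google treats two queries as the same intent bucket; low overlap suggests different buckets. A sketch with made-up result lists (any real use would need the actual top-10 URLs you collected):

```python
# Jaccard overlap between result sets for keyword variants.
# All domains below are invented placeholders.

def jaccard(a, b):
    """Set overlap: |A intersect B| / |A union B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

serp_poe = ["siteA.com", "siteB.com", "siteC.com", "siteD.com"]
serp_8023af = ["siteA.com", "siteB.com", "siteC.com", "siteE.com"]
serp_remote_power = ["siteF.com", "siteG.com", "siteH.com", "siteA.com"]

print(jaccard(serp_poe, serp_8023af))        # high overlap: same bucket?
print(jaccard(serp_poe, serp_remote_power))  # low overlap: different intent?
```

A low score between two synonymous phrases is exactly the clue described in the post: Google has classified them as serving different user intents, and you probably only belong in one of those result sets.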
| 3:46 pm on Oct 29, 2010 (gmt 0)|
|crucial importance to try to explicitly identify for Google exactly the kind of site I have |
Yes, that's exactly what I was getting at.
"If the world is going to Hell in a bucket, I want to hold the handle." --Peter O'Toole in "Club Paradise"
If Google is going to put you in a bucket, you probably want to decide very firmly and definitively which bucket you want to be in and then figure out how to get there.
| 4:09 pm on Oct 29, 2010 (gmt 0)|
|you probably want to decide very firmly and definitively |
Yes, I take them at their word, that the goal they have set out is to know what a person wants even if that person is not sure. That can be read 3 ways:
~ What arrogance!, or
~ How foolhardy!, or
~ That's a brilliant strategy.
They don't care how I read it, but they've said what they've said and I take them seriously, and therefore if that's their goal, then the one thing they don't want is doubt. Doubt is their enemy, because as likely as not it could bring back the wrong results. So...... remove doubt as best you can, and have a better shot of Google getting it right. Now... how to best do that is the challenge.
| 4:29 pm on Oct 29, 2010 (gmt 0)|
I'm thinking in some cases splitting your site into multiple domains might be the way to go.
In my day job, I work for a mid-sized b2b. When I started, our site was basically a brochure. We've since added a blog and lots of informational content related to our industry, but we still have a lot of the brochure-type content on the same site.
Now I'm thinking, maybe use the main site exclusively for information (those queries have vastly more traffic, our particular niche gets close to zero transactional queries because most of our potential clients either don't know our services exist or already employ us). Put all of the product and company info stuff on a separate domain. Cross-link the domains in the nav so it's transparent to the user.
It's sort of the next level of information architecture: domain siloing, where you put content designed for one user intent on one domain and content for another user intent on another domain, all within what pretty much looks like a single website. I've seen sites in some industries which already appear to be practicing this quite successfully.
| 4:57 pm on Oct 30, 2010 (gmt 0)|
OK, another thought on this: do you think it would be better to split content off into new domains or into new subdomains?
| 5:24 pm on Oct 30, 2010 (gmt 0)|
|I've seen sites in some industries which already appear to be practicing this quite successfully |
Just have a look at Google itself.
| 7:17 pm on Oct 30, 2010 (gmt 0)|
I'm using language meta tags for English - do you recommend deleting those tags?
| 7:25 pm on Oct 30, 2010 (gmt 0)|
Technical talk aside, the true success of any of these Google adaptations can be measured in two ways: user satisfaction and profitability. By default they have done well in the latter; as for the former, I think we all universally agree that in the last six months Google's ability to return the correct results in almost all situations has taken a tremendous dive.
IMO the more testing they do, the worse the results are getting.
| 7:38 pm on Oct 30, 2010 (gmt 0)|
I recently worked on a project where the company used to have one website with three distinct target audiences. They were doing OK - but with a lot of ups and downs.
They created three websites, one for each user intention on three different domains. A few months in and it's working well. The visitor segments have a much easier time and Google has it sorted out very nicely, without any roller coaster traffic - in either volume or quality.
| 9:20 pm on Oct 30, 2010 (gmt 0)|
Is there an overlap of products on each site? In our niche, as soon as Google sees overlap, the lesser site gets wiped out.
| 9:20 pm on Oct 30, 2010 (gmt 0)|
I used to be involved in multi-dimensional analysis of time-dependent variables based on "user" responses.
Often if you actually relied on the statistical result you missed the "big picture" and often the manual conclusions were based more on gut feel than anything.
Conclusion: you can only profile where the user "fits" a specific statistical conclusion set, or several sets - so I would agree with Shaddows: Google is trying to fit your site within a set of contexts.
Proof: how many times have you seen your site appear in searches that your site's pages are not specifically aimed at?
| 9:32 pm on Oct 30, 2010 (gmt 0)|
|Is there an overlap of products |
There's a strong overlap of keywords - but this is more of a service offering than a product offering.
| 8:36 pm on Oct 31, 2010 (gmt 0)|
Quite some time ago, Google introduced a tool to (ostensibly) help you track conversions.
Clearly they've been interested to know every single aspect of conversion for ages. However, they used it for far more than many might have anticipated.
And now they have absolute control of how they want to shape traffic. And clearly they will tune the knobs until they profit the most out of it.
| 9:04 pm on Oct 31, 2010 (gmt 0)|
Clark, you have it right. If they cared about the most accurate organic results, the resources being spent elsewhere would be discovering the same spam patterns I am in just a few niches. Each day I watch new businesses bounce to first with the addition of 20-40 new php dhdhwgaeki.com backlinks.
We all gave Google way more information than we should have, and now they're using it to shape our thoughts and searches.
What all of you are saying here is great, but when I can type a two-word keyword set with 1.3 million searches a month and the first result is a month-old page - a recently bought expired domain that lists another site in the description, a doorway page - I know they put the cart before the horse.
They don't today have the quality results of 12 months ago. Before they can funnel our searches they need to secure a firmer foundation.
| 9:26 pm on Oct 31, 2010 (gmt 0)|
The problem with caring about "the most accurate" results is that it's a judgment call - always subjective and not quantifiable. So Google cares most about "the most useful" results - what their users respond to.
This has been the long-time key to Google's profitability, and the two goals are not necessarily at odds. I see no evidence that Google is chasing profit above user satisfaction. Pleasing the user is still the primary goal.
To be sure, profit is a primary way that Google keeps track of how well they are doing. But did you notice that with the new Place Search, the Google Map floats down on the right as you scroll, obscuring the Adwords? That's not putting profit above all else.
And so it is with Traffic Shaping. This is an approach that "should" increase user satisfaction if Google gets it right: the right classifications, assigned and matched, for the query, the user and the website.
| 10:38 pm on Oct 31, 2010 (gmt 0)|
Shouldn't pleasing visitors be the primary goal of all webmasters?
Google is no different than a regular webmaster. It is a superwebmaster on steroids, a communion of highly trained and highly specialized brains that altogether perform as a webmasterborg. And when all the pieces fall into place, the #1 ideal search result would be in accord with the objectives of the user, the creator of the page and Google.
What I can't come to terms with is this idea of us having to convince Google about anything.
A decade ago, when first entering this field, I learned the 'old school' dictate that visitors would unambiguously be told what the landing page in question was about. Every comma, every dot and every image on the landing site had to clearly communicate the essence of the page and what the visitor's expected behaviour was. Those were the days of expensive bandwidth, so quickly "defining" things was called for. Should the user's response not be the expected one, a door was shown. The door would take the form of narrower and narrower filtering to help such a user define his/her own intent, as we were aware that some users would be vague. Thus, choices were provided.
Choosing the door would mean sending visitors to other sites with other clearly defined objectives, represented by the information on those new sites. And this routing would go on and on and on... forever, if possible. Or until there was a eureka moment, which was the ultimate goal.
Reading this highly intellectual discussion, I keep coming back to the idea that a webmaster has created a page/site that is somehow cloudy, so Google does not know what to make of it, and I keep scratching my head over how in the world this may have happened. It is like starting things at the end! The other way around.
Are webmasters not to plan a site ahead of launching by defining the #1 goal for such site and then using all resources available to deliberately delineate/shape/influence the behaviour of visitors that happen to come across these pages?
Maybe the web has gotten a lot more complex now and I am out of my league :(
| 4:46 pm on Nov 1, 2010 (gmt 0)|
|They created three websites, one for each user intention on three different domains. |
Thanks, tedster, that's the sort of info I was looking for. The company I work for in my day job has a similar situation in terms of having at least two, possibly as many as 4 distinct target audiences we are currently trying to serve with only one domain. I think I'll see if I can convince them to launch some more domains and split the content up. We have two quite distinct product lines, and within those probably two different intents: informational and transactional. So I'm thinking 4 domains with lots of navigational interlinking may well be the way to go.
| 5:24 pm on Nov 1, 2010 (gmt 0)|
|did you notice that with the new Place Search, the Google Map floats down on the right as you scroll, obscuring the Adwords? That's not putting profit above all else. |
I don't agree. Marketing is part of profit too. If you convince people that you care more about quality than profit, people will use your search engine because they perceive it to be better. That's why they do little things like cover up their ads and throw you a bone here or there. It's brainwashing people into believing Google is a "different" kind of company. Heart in the right place and all that.
Don't forget, they had a designer quit because he had to defend how many pixels wide he wanted to make a border and back it up with data.
When they cover up their ads, they analyze the effects. If they measure border width, you can bet they will measure how traffic shaping will profit them most...if a better user experience is a side effect, so be it. But that's not the goal (imo).
| 5:41 pm on Nov 1, 2010 (gmt 0)|
I'd like to refocus this thread on Traffic Shaping. I'm so happy that Shaddows shared his data and his analysis. The idea helps explain many ranking mysteries webmasters have been reporting - mysteries that don't make sense if we assume Google uses one algo, one ranking method, for all queries and all websites.
Here's my summary of the Traffic Shaping idea:
Google has a number of classifications (taxonomies, profiles) in several areas:
1. user histories
2. query terms
3. websites
These categories are dynamically created (so they may be revised and refined) and also dynamically assigned. There appears to be a way of testing websites as to whether they are in the correct bucket or not. And sometimes Google tests this profile assignment - shifting it around over short periods, which can wreak havoc with traffic quality and/or volume.
If I were to guess what this means for SEO in a practical sense, I'd say:
1. Make your signals as clear as you can on every page, and avoid having one page (or site) be double focused.
2. If some kind of yo-yo starts up in your case, don't panic and don't make changes willy-nilly, but study how things are when they are at their best. Then if things settle out at their worst, you have an idea of what kind of profile Google thought you were. If that really is what you intend to be, go back to #1
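Point 2 above suggests logging the swings rather than reacting to them. Here's a minimal sketch (thresholds and numbers invented for illustration) of flagging the kind of conversion yo-yo described in this thread - days where conversion rate jumps or drops sharply while traffic stays flat - so you can line the dates up later against what Google seemed to think your profile was:

```python
# Flag days where conversion rate departs sharply from the trailing
# average while traffic stays flat: the "referral shift" signature
# described in this thread. All numbers and thresholds are illustrative.

def flag_shifts(daily, window=7, threshold=0.25):
    """daily is a list of (visits, sales) tuples. Return indices where
    CR deviates more than `threshold` from the trailing-window mean
    while visits stay within 10% of their trailing mean."""
    flags = []
    for i in range(window, len(daily)):
        recent = daily[i - window:i]
        mean_visits = sum(d[0] for d in recent) / window
        mean_cr = sum(d[1] / d[0] for d in recent) / window
        visits, sales = daily[i]
        cr = sales / visits
        visits_flat = abs(visits - mean_visits) / mean_visits < 0.10
        cr_jump = abs(cr - mean_cr) / mean_cr > threshold
        if visits_flat and cr_jump:
            flags.append(i)
    return flags

# Ten flat days, then a traffic-neutral conversion drop on day 10:
data = [(1000, 50)] * 10 + [(1010, 30)]
print(flag_shifts(data))  # the final day should be flagged
```

Keeping a log of flagged dates costs nothing and gives you exactly the "study how things are when they are at their best" record that point 2 calls for.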
| 5:12 am on Nov 2, 2010 (gmt 0)|
Here's a thought - all of our SEO and any non-original content is simply ignored. Google seems to be doing sectional targeting and applying off-site influences to the sections it deems unique. I have a feeling we're ranking only portions of pages lately. To prove that, I've found pages where quoting one section in a Google search returns nothing for my site, but quoting a different section ON THE SAME PAGE does return my site.
If you have a product feed, for example from a popular retailer, and you use it verbatim, I guarantee you will not receive any traffic unless you offer more, regardless of your SEO efforts.
"More" may include a review or images by the site owner; comments and 3rd-party added content don't get full value.
Outside influences may range from your social net imprint to traditional incoming links to you as a profiled webmaster, probably all of the above.
For 2011 unique content may be the only way to go and expecting that to come from visitor comments and such may not be all you need to do. Also, it seems we are bound by expectations. Our sites are expected to attract a set number of people that have x profile associated with them. If we suddenly attract more, or less, a dampening factor feels like it kicks in. Watch your non-paid traffic levels when you start up your next major ad campaign to see what I mean.
Changes that increase repeat visits while lowering bounce rates are raising traffic levels on my sites nicely right now.
Fact: Google wants to rank unique content equally/fairly regardless of on page SEO. Google also places little trust in webmaster supplied information. I think they now have the technology to more closely accomplish those goals rendering SEO efforts less effective.
The big 5 in my opinion, and perhaps the only 5 factors that matter moving forward, are:
Unique and meaningful content
Page code (well formed, cross browser compatible, efficient)
Visitor behavior (relatively new to SEO)
I think Google is still working out the Visitor Behavior aspect of SEO because not all types of sites can be expected to receive the same reaction. Ask yourself this - Do MY visitors click on more pages and talk about my pages in their social profiles (and return more often, and leave more comments on etc) on MY site when compared to the sites Google deems related to mine? If not, anything you can do to fix that is actually SEO. With Google profiling visitors it's more important than ever.
| 1:59 pm on Nov 2, 2010 (gmt 0)|
|I think Google is still working out the Visitor Behavior aspect of SEO because not all types of sites can be expected to receive the same reaction. |
They have a lot of work to do. Here's the dilemma that I see: My purpose of search and their intentions/goals are no longer compatible, and I wonder how many other people are starting to feel the same way. I probably search less than most members here, but still do my share, and G is becoming more & more frustrating. Of course I try to relate my experiences to the "average" websurfer, and I look at my stats, and I scratch my head because I think I meet all 5 of Sarge's essentials and yet my numbers seem to be stuck within a +/- 5% zone, and my conver$ions are down.
Are conversions down because of the kind of traffic G is now sending? Is it because of a fragile economy and low consumer confidence? Don't know, but traffic is mostly the same (throttling?) and money is consistently lower. So what to do?
Back during Mayday my main site took a dive off the cliff in early June. I had already scheduled a camping trip to New Mexico & Idaho so I left for a month and didn't touch a thing on my sites ~ when I returned, all the traffic numbers had returned to their pre-Mayday levels. Which makes me think that sometimes the best thing to do is to do nothing and let it sort itself out... at least for a while.
| 2:48 pm on Nov 2, 2010 (gmt 0)|
|They have a lot of work to do... sometimes the best thing to do is to do nothing and let it sort itself out |
In a way the second part follows logically from the first. It's pretty clear that Google is in the testing and development phases of significant changes in the way they handle user behavior, taxonomy, and profiling. There also seems to be pretty broad agreement that they haven't got it quite right yet, indeed that they may well be making things worse at the moment. However, that doesn't mean they're going to stop -- you don't get to work for Google by giving up easily.
So assume that there will continue to be major changes until they achieve a significantly higher level of quality. Therefore, any results you see in the short term from changes you make now may not be predictive of future outcomes.
Therefore, either do nothing, or better yet do things which are likely to be a good idea regardless of Google changes. Sgt_Kickaxe's suggestion of making changes that increase repeat visitors and lower bounce rate is a perfect example. That's always a good idea regardless of search engine behavior. The steps I outlined earlier for my situation mostly fall into that category as well.
If Google is performing the kind of multivariate testing we're talking about here, sending large groups of profiled users to different categories of site to test the outcomes, presumably they are doing this with the goal of improving the quality of results. Some might suggest that they are simply trying to maximize short-term profit, but I think they're smarter than that. To maintain their edge long-term, they have to take user satisfaction with search to another level. If that's the case, then as a webmaster you want to keep your eye on your long-term goals as well. What is your site _really_ about? How does it provide value to the searcher? How can you improve on that?
| 3:25 pm on Nov 2, 2010 (gmt 0)|
Re testing outcomes....how do they measure outcomes?
In our business almost every order has to go out for 3 bids. There are not a lot of instant sales. If they lump us in with regular b2bs it's not good.
| 4:43 pm on Nov 2, 2010 (gmt 0)|
Just one suggestion: have a "quote me" button where others might have a "buy" button. Of course, how they monitor that is up for debate - but they do have Chrome, Android and the venerable Toolbar, not to mention Analytics and their ad network.
|In our business almost every order has to be 3 bid. There are not a lot of instant sales. If they lump us in with regular b2bs it's not good. |
Also, I think this is another example of the granularity Tedster was seeking- b2b that results in a quote, rather than a sale.
| 1:28 am on Nov 3, 2010 (gmt 0)|
|There appears to be a way of testing websites as to whether they are in the correct bucket or not. |
This must be why I'm having a problem ranking for new themes - only old themes/pages rank high--the others get hardly any traffic even though I have multiple pages on the new themes.
| 6:42 pm on Nov 23, 2010 (gmt 0)|
This thread is decaying too fast for its importance, IMHO!
|The big 5 in my opinion, and perhaps the only 5 factors that matter moving forward, are. |
Unique and meaningful content
Page code (well formed, cross browser compatible, efficient)
Visitor behavior (relatively new to SEO)
Agreed in most parts, but the basal factors like "alt tags" or overall keyword density still count and should just be covered - maybe not even be mentioned any more, but just covered. Does not hurt!
More related to "Visitor behavior" (which is a bold statement without testing) is the discussion in the paid membership area about promoting the site with tweets only! That IMHO correlates to the behavioral aspect a lot (even if it's off-page).
On a side note: we had a longer discussion about the "traffic shaping" topic (another diamond hidden here!) that Shaddows mentioned, and tedster did, too:
|And don't panic when you see medium-term flux- that's the expected interim result |
I see the same effects on a pretty solid scale of currently 150k uniques... traffic keeps its level, but the conversions swing up and down! That is one indication!
| 9:17 pm on Nov 23, 2010 (gmt 0)|
WRT "The Big Five" factors - overlooked there is an increasingly important SE factor: semantic clarity. When your content is sending clear semantic signals, then there's little need for Google or anyone else to do dynamic testing to find the "right" audience, the "right" query intention and so on.
It is easier to achieve semantic clarity with an all-div layout - tables tend to juxtapose content in the source code that comes from various parts of the visible page, for example. Does this mean I only work with table-less layouts? Hardly - it's often not practical with an enterprise-level CMS.
Still, optimizing semantic and relevance signals in the content area is an important factor in avoiding the dynamic testing that may otherwise be your site's fate.
Many sites (I dare say most sites) do just fine and don't experience the kind of fluctuations that this thread is focusing on. But it is an area I'd double-check if you find yourself on a severe conversion roller coaster.