| 6:54 pm on Sep 7, 2012 (gmt 0)|
Is it always the same wrong page or set of wrong pages, or is it pretty much random?
| 11:26 pm on Sep 7, 2012 (gmt 0)|
Pretty much random. The pages are all loosely related, just not always the best-converting or most relevant. It's as if Google is trying too hard and guessing wrong. Or maybe they are just A/B testing to find out which pages on a site are the best for the query when there are multiple possible pages.
| 3:42 am on Sep 8, 2012 (gmt 0)|
|maybe they are just trying to find out which pages on a site are the best for the query when there are multiple possible pages by a/b testing |
Given the preference Google has for machine learning, I'd say that your idea here has a high chance of being accurate. Now the question I'd ask is why the choice they are being given seems so unclear to them - or how can an affected site clarify the ideal page.
Are others who are seeing zombie traffic also seeing this pattern - when conversions go down the landing page Google is ranking is different?
| 11:57 am on Sep 8, 2012 (gmt 0)|
|how can an affected site clarify the ideal page |
Without trying to sound like a smarta**, AdWords.
I just ran a report in WMT,
Keyword: "Example Florida"
Avg position: 1.1
Number of different pages shown over last 30 days: 22
Best (ecom converting) page / most often shown page: example.com/florida
why would they do this?
1. Geolocation. A good percentage of the "other" pages being shown are drill-down pages based on location, for example: example.com/florida/broward/how-to-use-an-example-in-broward-county.html. So the search was for "example florida"; where did the "broward" come from? Either (a) geolocation based on the searcher's IP, or (b) personalized search (the searcher previously searched for "broward").
2. Do - Know - Go. We know that Google is doing a lot with respect to user intention. This search, "example florida", can be Do or Know (ecom or informational). However, with a short-tail keyword like this I have to imagine it's going to be difficult to determine whether the searcher's intentions are Do or Know. (There should be no reason for this query to be put in the Go (navigational) bucket, and I see no indication that Google has ever done so.) Most of the "other" pages being shown would fall into the "Know" bucket: informational pages that do not convert very well.
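The Do / Know / Go idea above can be sketched as a toy classifier. To be clear, the hint lists and rules below are invented for illustration only; Google's real intent engine is data-driven and far more granular. The point of the sketch is the last case: a short-tail query like "example florida" matches neither bucket and comes back ambiguous, which is exactly the difficulty described above.

```python
# Hypothetical sketch of Do / Know / Go intent bucketing.
# Keyword hint lists are invented for illustration.

DO_HINTS = {"buy", "order", "price", "cheap", "sale"}
KNOW_HINTS = {"how", "what", "why", "guide", "history"}

def intent_bucket(query: str) -> str:
    """Guess whether a query is transactional (Do), informational (Know),
    or navigational (Go). Short-tail queries often match nothing."""
    words = set(query.lower().split())
    if words & DO_HINTS:
        return "Do"
    if words & KNOW_HINTS:
        return "Know"
    if query.lower().endswith((".com", ".org")):
        return "Go"
    return "ambiguous"

print(intent_bucket("buy example florida"))    # -> Do
print(intent_bucket("how to use an example"))  # -> Know
print(intent_bucket("example florida"))        # -> ambiguous
```

A one-word change ("example florida" vs. "buy example florida") flips the bucket, which is why the long tail converts so differently from head terms.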
|why the choice they are being given seems so unclear to them |
I chose a query that returns a page from our site in the 1.1 (avg) position because, at this level, I don't think the varying pages are an indication of signals on the website; they're more an indication of variables that Google is adding to the searcher's query.
| 12:44 pm on Sep 8, 2012 (gmt 0)|
It looks/feels a bit like there is some stream processing going on. Of course, going down that road of thinking would violate the strict "delineation between AdWords and organic" mindset and assume that Google is indeed tweaking their system for profit.
Say you take x amount of revenue queries and return results from index "a" until revenue targets are met (this could literally be on a sub-second basis); when the target is hit, send those same/similar queries to index "b". The possibilities are endless with this, but it would certainly explain what appears to be "random". You'd really have to look at this from a revenue-shaping standpoint vs. a traffic-shaping standpoint.
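The revenue-shaping hypothesis above is mechanically simple, which is part of its appeal. Here is a toy sketch of the router it describes; everything in it (the revenue-per-query figure, the target, the query stream) is invented, and nothing here claims Google actually does this.

```python
# Toy sketch of the hypothesized revenue-shaping router:
# serve from index "a" until a revenue target is met, then flip to "b".

def route_queries(queries, revenue_per_query, target):
    """Yield (query, index) pairs, switching from index 'a' to 'b'
    once cumulative estimated revenue reaches the target."""
    revenue = 0.0
    for q in queries:
        index = "a" if revenue < target else "b"
        yield q, index
        if index == "a":
            revenue += revenue_per_query.get(q, 0.0)

queries = ["example florida"] * 5
routed = list(route_queries(queries, {"example florida": 2.0}, target=6.0))
print(routed)  # first three hit index "a", then the stream flips to "b"
```

From the webmaster's side, a flip like this would look exactly like the "switch" behavior reported in this thread: identical queries, identical traffic volume, suddenly different results.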
| 2:30 pm on Sep 9, 2012 (gmt 0)|
Just when you think you're out of the woods and on a good run, they shut you down for a few days. Then it's back on a tear with normal, regularly spaced conversions, then bam! Off again. The past week was doing fine until mid Friday, then the usually busy Saturday had the sales on the hour in the morning, then completely shut down for the rest of the day. Sunday appears to be more of the same throttling. Kinda backs up what TS says above, but the theory can never be confirmed.
| 8:46 pm on Sep 9, 2012 (gmt 0)|
backdraft7 - As tedster noted, traffic reports are likely to be less helpful than some analysis that characterizes your site and your situation. It's been so long that I've forgotten... are your product categories targeted for localized results, in the way, say, that timwilliams suggests that his are?
Additionally, are others getting throttled only on localized searches, or does this extend to searches that may not be localized at all?
Are any experiencing throttling in locations where placenames are ambiguous and there's likely to be confusion? This might apply either when you've targeted such locations, or when you simply happen to be located in such a place. Please refer back to my May 2, 2012 thoughts about possible geo-confusion (or testing?) between the US and Australia in this thread...
Google Updates and SERP Changes - May 2012
Again, I'm thinking that Google may be testing more than simply geo factors at this time.
| 1:18 am on Sep 10, 2012 (gmt 0)|
RC - I sure wish I could draw some common denominator to explain the patterns. They seem to be the result of either endless tinkering with the algo (which makes little sense, because pre-2010 we never experienced this even though they were endlessly tinkering with algos) or a new randomization system.
For now I'm chalking it up to the latter, a new random everflux, designed to keep us guessing, forever.
In a few hours, we'll probably see many similar reports from our UK friends confirming the low-quality traffic results from the past two days.
| 3:17 am on Sep 10, 2012 (gmt 0)|
backdraft, it seems to me that there must be some factor that changes when your traffic turns "zombie". What have you checked?
| 6:57 am on Sep 10, 2012 (gmt 0)|
@tedster - when traffic levels remain steady and serps are pretty much unchanged, yet traffic quality suddenly switches from "good" to "bad", what can you check? It's all US traffic, not suddenly foreign. The only thing I can see that is changing is the bounce rate, which went from about 42% to 57%. Why? Only Google knows.
I know the sales conversion rate probably doesn't help, but when it was like clockwork on the same traffic levels between 1999 and 2010, then slowly evolved into the ON/OFF periods we regularly see today, it just confirms some sort of manipulation. How or why they are doing it is a mystery.
| 11:44 am on Sep 10, 2012 (gmt 0)|
At some stage the concept of traffic washing may well become easily understandable.
Consider: what if you're still ranking for the majority of terms you usually rank for, getting the majority of the same traffic you usually get, alas, a certain minority of your usual traffic is, well, not presently visiting you?
Taking a wild guess, you've been diligently using that convenient free analytics conversion tool to measure and identify the 'buyers' for a while. If not you, then the majority of your website peer group probably is, so I'm not sure I understand why it's difficult to see how this happens.
As for the other, anyone checking out the use of the word ecosystem might find that 'zoo', or perhaps 'farm', would be more accurate.
| 12:34 pm on Sep 10, 2012 (gmt 0)|
@backdraft7, I'm curious: in your WMT, select one of your better keywords and drill down to see how many different pages from your site were shown for that keyword in the last month (traffic > search queries > query). I'll bet Google didn't show the same page every time the keyword was searched for over the last month, did they?
| 2:23 pm on Sep 10, 2012 (gmt 0)|
Tim - I did check that, and yes, there are at worst two or three other pages, but one leads by a wide margin and the others have small visit numbers, so it doesn't explain much. Looks like they are A/B testing some pages (expected), but not enough traffic impact to explain why traffic goes so cold so quickly.
|Taking a wild guess, you've been diligently using that convenient free analytics conversion tool to measure an identify the 'buyers' for a while, if not you, then, the majority of your website peer group probably is, so I not sure I understand why its difficult to see how this happens |
Probably because I just use it for its surface features and no longer trust Google enough to set goals or dig much deeper into GA. That, or (it's ok to say it) as inferred, I'm just stupid.
| 2:39 pm on Sep 10, 2012 (gmt 0)|
And those other pages, how do they convert when they get traffic? As good as the normally shown page?
Now apply that across your entire site, all keywords. If Google flips a switch to A/B test different pages for all your keywords, could that cause traffic to appear to suddenly go cold and not convert?
| 3:03 pm on Sep 10, 2012 (gmt 0)|
Consider keeping your stats to yourself; there are a few alternatives out there. And if the rest of your peer group follows, perhaps as people change their querying habits over time, it won't be so easy for traffic to become zombie in nature.
| 3:08 pm on Sep 10, 2012 (gmt 0)|
Tim - for the most part they did quite poorly. 100% bounce on most.
One (#3 candidate) had a very low bounce rate of 16%, while the #1 and #2 trials were in the 36% to 38% bounce rate range. #3 only had a few dozen hits. #1 was in the thousands, so it obviously dominated the serps.
For the most part, the #1 positions outweigh the others by a huge margin, so this A/B test wouldn't exactly explain the "days-long" dead zones. This morning I'm already getting emails from other business owners reporting the same poor sales over the weekend and continuing today. It's not just me, I guess.
My theory goes like this:
Imagine a forest with nice neat little trails interconnecting known destinations. That's the OLD way Google worked. Nice, reliable paths to finding what you need and a great user experience since you never got lost.
Today, Google replants the trails and blazes new ones every few days. The effect is just like a grocery store moving all their products around so the old regular customers now have to chase everywhere to find what they are really looking for...and making many clean misses along the way.
It's sad, but I'm more inclined each day to believe this is just their way to increase AdWords income. Nothing more. They seem to care little about user experience.
| 10:23 pm on Sep 10, 2012 (gmt 0)|
Sometimes G.A. seems to mask what's going on with a site, or I just don't use it well.
G.A. was showing around 18% Safari use which seemed a bit on the high side so I removed G.A. on six "money" pages and replaced it with a different analytics program. Safari use on those pages runs between 20.4 and 31.5 percent during the day with a small percentage at 320 x 480 resolution. The main offer on those pages barely shows on the right side of the screen at that resolution. The low resolution users seem to often group together and come in waves, I have yet to determine if that is a search engine phenomenon or after work mobile use.
There are also significantly more returning visitors utilizing the free resources on those six pages than there are on the site as a whole.
Both situations result in visitors which aren't likely to lead to an immediate "sale".
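The cross-check described above (replacing one analytics package with another and tallying from your own data) can be sketched in a few lines. The visit rows below are invented; in practice they would come from a server log or a second analytics export.

```python
# Hedged sketch: tally browser and screen-resolution share from your own
# (hypothetical) visit records rather than trusting one analytics package.

visits = [  # (browser, resolution) -- invented sample rows
    ("Safari", "320x480"), ("Safari", "1024x768"), ("Chrome", "1280x800"),
    ("Safari", "320x480"), ("Firefox", "1280x800"),
]

total = len(visits)
safari_share = sum(1 for b, _ in visits if b == "Safari") / total
low_res_share = sum(1 for _, r in visits if r == "320x480") / total

print(f"Safari share:  {safari_share:.0%}")   # -> 60%
print(f"320x480 share: {low_res_share:.0%}")  # -> 40%
```

Comparing the same ratios from two independent data sources is the quickest way to find out whether the 18% vs. 20-31% Safari discrepancy is a measurement artifact or a real audience shift.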
| 12:46 am on Sep 11, 2012 (gmt 0)|
backdraft-my sales went bye-bye over the weekend as well and have not returned. Traffic and the like are the same just no conversions. I, too, feel like there is some "switch" that gets flipped. We can go all day w/o an order, then get all of our orders for the day within an hour and just like that it's over again.
We were on pace for our best year ever BUT if this keeps up...
| 6:25 am on Sep 11, 2012 (gmt 0)|
|backdraft-my sales went bye-bye over the weekend as well and have not returned. Traffic and the like are the same just no conversions. I, too, feel like there is some "switch" that gets flipped. We can go all day w/o an order, then get all of our orders for the day within an hour and just like that it's over again. |
|We were on pace for our best year ever BUT if this keeps up... |
Same here, 100% identical! WTF exactly is Google playing at? One thing's for sure, the wider economy will notice this pretty soon....
| 2:58 pm on Sep 11, 2012 (gmt 0)|
Please - this thread is for analysis, and simply reporting on your conversions is not analysis! Let's dig into the data and get a handle on what's happening.
| 3:32 pm on Sep 11, 2012 (gmt 0)|
As I've stated in the past, the more you dig into the data, the less you see. It's really frustrating.
Our site is extremely top-heavy when it comes to traffic numbers- a small percentage of pages get the vast, vast majority of traffic.
In fictional numbers, say the lower-performing 80% of pages got 10-20 unique visitors per day; even in a bout of Zombie Traffic, they would still be getting 10-20 per day. But instead of some being up and some being down, the tendency would be towards the top end. Headline traffic would be up 10-20%, but any given page would be well within normal parameters.
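That effect is easy to demonstrate numerically. Using the fictional figures above, if every long-tail page drifts toward the top of its normal 10-20 range, no single page looks abnormal, yet the sitewide total jumps by double digits:

```python
# Numeric sketch of the "top-heavy drift" point: per-page traffic stays
# inside its normal 10-20 range, but the sitewide total lifts 10-20%.

import random

random.seed(1)
PAGES = 1000  # fictional long-tail page count

normal = [random.uniform(10, 20) for _ in range(PAGES)]  # usual day
zombie = [random.uniform(14, 20) for _ in range(PAGES)]  # skewed day

lift = sum(zombie) / sum(normal) - 1
print(f"normal total:  {sum(normal):.0f}")
print(f"zombie total:  {sum(zombie):.0f}")
print(f"headline lift: {lift:.0%}")
# every individual page is still inside 10-20, so per-page it looks fine
```

This is why per-page analysis can come up empty even when the aggregate clearly moved.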
I get the impression we're not typical of Zombie victims, so I'm not sure how that fits into the overall picture.
I've decided that analysis is not helping, so I've stopped worrying. It hasn't affected our bottom line, and it has taken up a lot of my number-crunching time over the months and years.
Currently, I'm thinking it is fairly benign. Google switches off a bunch of filters, or the Intent engine is turned off, or personalisation is dialed down. It always happens when the SEO chatter is about ecom disruption, so I'm fairly sure it's related to algo module releases.
We're not particularly geo-targeted, so we don't notice that filter like many do. However, we do rank very well for a mix of info and ecom terms. I can only imagine an Intent-Type that we haven't thought of has our pages suppressed from their SERPs, but suddenly we get shown to them. They're not impressed and leave.
Most likely, the people who see quality traffic disappear are highly targeted in some way. Google could very well be hiding (normally) a bunch of irrelevant sites from your target customers. When the filters get switched off, all the irrelevant stuff starts competing again, losing you the good traffic. Meanwhile, searches you could rank for but don't (because G knows you're not a good match for that particular person on that particular occasion) start returning your site, resulting in untargeted traffic. Geo-location would be a prime candidate for that.
In any case, unless I have a eureka moment, or someone has a truly inspired insight, I'm giving up on tracing this phenomenon.
Good luck to those materially affected by this issue. I hope you figure it out.
| 3:52 pm on Sep 11, 2012 (gmt 0)|
Had a quick flash of inspiration on the "Intent-Type" we have not thought of. It's one we have thought of actually, but it fits the bill.
Price hunters. We don't cater to the bargain-seeking crowd. If we got a rush of them, they would take one look at the price and scarper. I suppose it would not take much for Google to keep such people away from us under normal circumstances.
Again, not sure how helpful that will be to the rest of you, but it's my best guess for my own situation.
There's a parallel thread happening, all about the Query Intent problem I've outlined. I'm just going to quote tedster as this is EXACTLY what I'm talking about:
|The more thought I've put into it, the more hurdles I find that would make it difficult to come up with pure data to determine searcher intent from this side of the search engine. |
I've come to the same conclusion. It takes an immense data-mining effort to do this, with a lot more data than one, or a hundred, or even many thousand websites can accumulate.
One rough breakdown of user intent is informational, navigational, transactional, with the question of local intention (or not) folded in to various degrees. Google's been at work on this project for many years and they still don't get it right all the time. I also think that Google's breakdown is much more granular (and possibly personalized) than this top level taxonomy.
One interesting facet of Google's work with user intent is that it also requires automated taxonomies to be created for various web pages and websites. So Google's job requires classifying the various queries, classifying the websites, and then matching them. In many cases, queries also seem to need a mixed set of results, at least for now.
The only practical steps I have been able to come up with are in the keyword research I do for a website. Before I even think about targeting a new keyword, I always check the Google results (including the suggestions!) to get a feeling for how that phrase and its relatives are currently classified.
If Google sees it as having one intent but the website is of a different nature, then that keyword phrase may be a lost cause for a given website. And the real kicker can come if Google re-classifies either a site or the query phrase itself. A healthy flow of search traffic can all but dry up overnight.
Determining Searcher Intent [webmasterworld.com]
| 4:39 pm on Sep 11, 2012 (gmt 0)|
|Please - this thread is for analysis, and simply reporting on your conversions is not analysis! Let's dig into the data and get a handle on what's happening. |
I'm sure this is pointed at me, but like I've said before, conversions are the only solid indicator we can measure. As to what causes traffic to be "poor quality" or "non-converting", what metric do you suggest measuring? We're not seeing an abnormal number of foreign visitors, and even if we did, we sell globally. As far as I can see, bounce rate is one key indicator, since it implies that the visitor did not find what they were looking for.
How about, rather than suggesting "get out the spreadsheet and analyze the data", we put together a step-by-step list of how to use GA and specific metrics to determine why traffic converts reliably for days, then shuts off for days? I personally think the answer is hidden in their system in a way that makes diagnosis next to impossible.
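One concrete version of that step-by-step comparison: export daily (visits, bounces, conversions) rows, split the days into converting ("on") and non-converting ("off") sets, and compare bounce rate across them. The sample rows below are invented, and the 5-conversions threshold is an arbitrary cut; real data would come from a GA or log export.

```python
# Sketch: compare bounce rate between converting and non-converting days.
# Sample rows and the conversion threshold are invented for illustration.

days = [
    # (visits, bounces, conversions) -- one row per day
    (1000, 420, 12),
    (980,  430, 11),
    (1010, 575, 0),
    (995,  560, 1),
]

on = [d for d in days if d[2] >= 5]   # "good" converting days
off = [d for d in days if d[2] < 5]   # "zombie" days

def bounce_rate(rows):
    return sum(b for _, b, _ in rows) / sum(v for v, _, _ in rows)

print(f"on-days bounce rate:  {bounce_rate(on):.0%}")
print(f"off-days bounce rate: {bounce_rate(off):.0%}")
```

With these invented rows the split lands near the 42% vs. 57% figures reported earlier in this thread, which is the kind of signature worth confirming against real exports before drawing conclusions.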
| 4:49 pm on Sep 11, 2012 (gmt 0)|
|I'm sure this is pointed at me |
Not sure it was specifically you, but the issue is that talking about conversions is just saying "it's happening again".
Saying something like
"My top 3 terms have dropped by 50%, while I am getting traffic on search terms I've never targeted and don't have a clue why I'm ranking for them."
That's got some meat. It's not helpful in itself, but it is applicable to other sites.
| 5:06 pm on Sep 11, 2012 (gmt 0)|
Guys, this is extremely simple, but it appears that it needs to be put bluntly.
You are getting almost the same traffic, except that you do not rank for those people who will actually buy, except during those periods when your traffic actually converts.
Analytics data, properly analysed, has been known to identify these people!
And even though you may have no idea, is it beyond you to get it that perhaps another group of folk who have access to your data do geddit?
This is important because it maddens me when I hear folk continue to talk happily about using that free but not private analytics program from a powerful corporate competitor to everybody, to measure anything!
| 5:43 pm on Sep 11, 2012 (gmt 0)|
|except that you do not rank for those people who will actually buy, except during those periods when your traffic actually converts. |
And what metric, exactly, will show us "buyer intent"? I thought Google was the only one with the mind-reading power to know that, and they aren't sharing.
In my situation, the traffic looks the same no matter if they are buying or not.
Whatever the cause, it must be something drastic enough to drive visitors off my site. The only things I could imagine doing that are:
1. Simple non-targeted traffic... nothing I can really measure here, except bounce rate.
2. Page load is hobbled by Adsense code - this has been reported by a few testers, but only sporadically.
3. The loss of long tail and semantic search. This is a conversion killer.
I still say it just boils down to Google being "broken".
Any conversion reports from me were to simply document the frequency of the issue. If they are unwanted or useless to the diagnosis, perhaps you're in the wrong thread. One thing is for sure, I'll keep that data under my hat from now on. Have fun diagnosing and chasing your tails!
| 5:57 pm on Sep 11, 2012 (gmt 0)|
|One thing is for sure, I'll keep that data under my hat from now on. Have fun diagnosing and chasing your tails! |
So which analytics package will be on your site tomorrow, next Monday, next month :)
I ask because I am certain you still don't geddit :)
| 6:42 pm on Sep 11, 2012 (gmt 0)|
Scooterdude, we do get it. I mentioned it myself earlier in the thread and others have mentioned it all over the forum. I personally don't use GA, and I still have seen what appears to be traffic shaping or something at times. So with all due respect to your point, which I totally agree with in general ("don't give Google free data"), I don't think it really applies to this problem.
| 6:52 pm on Sep 11, 2012 (gmt 0)|
|I don't think it really applies to this problem. |
Thanks for proving that you don't geddit.
| 7:21 pm on Sep 11, 2012 (gmt 0)|
Doesn't seem like this is going anywhere.
| 7:33 pm on Sep 11, 2012 (gmt 0)|
Just as a point of information, I dropped GA several months ago and I still get very clear periods of zombie traffic. (Although my overall conversion rate is slowly climbing. YMMV)
The biggest problem I have in analyzing these periods is that Google seems to be very careful to turn them on/off at odd periods of the day. Since the granularity of most analytics software is one day, it's hard (without custom software) to split out these periods to look for differences.
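The "custom software" mentioned above doesn't have to be elaborate: bucket raw order timestamps (from your own store logs, not the analytics package) by hour, then flag dead stretches, so an on/off flip that starts mid-day isn't hidden by daily rollups. The timestamps and the 3-hour / business-hours thresholds below are invented for illustration.

```python
# Sketch: find sub-day "dead" periods by bucketing order timestamps hourly.
# Sample timestamps and thresholds are invented.

from collections import Counter
from datetime import datetime

orders = [  # hypothetical order timestamps from a store's own logs
    "2012-09-10 08:14", "2012-09-10 08:51", "2012-09-10 09:22",
    "2012-09-10 10:05", "2012-09-10 19:40",
]

by_hour = Counter(datetime.strptime(t, "%Y-%m-%d %H:%M").hour for t in orders)

# flag any run of 3+ consecutive orderless hours inside business hours (8-20)
dead_runs, run = [], []
for hour in range(8, 21):
    if by_hour[hour] == 0:
        run.append(hour)
    else:
        if len(run) >= 3:
            dead_runs.append((run[0], run[-1]))
        run = []
if len(run) >= 3:
    dead_runs.append((run[0], run[-1]))

print(dead_runs)  # -> [(11, 18)]
```

Run over a few weeks of data, a report like this would show whether the "switch" really flips at odd times of day, and whether the on/off boundaries line up across days - something a daily-granularity report can never reveal.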