Forum Moderators: Robert Charlton & goodroi
Using a multivariate dataset, across a range of different keyphrases, user intents, and user types, Google exposed our site in marginal but significant ways (putting us up one place, dropping Universal Search, placing us above or below shopping results, etc.). They did this with (at least) four separate test sets.
BIG uplift in traffic since 1st SEPT (20% above trend), with a corresponding drop in conversion rate, so sales were broadly static (on trend). Referrals shifted at precisely the same time. No visible change in rankings.
A referral shift on 16th Sept, and another on 6th October, both traffic- and conversion-neutral (relative to 1st Sept).
Then the biggy: 12th October, a huge referral shift. Traffic-neutral, but conversions back at pre-Sept levels. In other words, we are now 20% up on sales. The referrals are NOT the same as (or even particularly similar to) the pre-September mix.
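The arithmetic behind that timeline is worth making explicit. A minimal sketch with purely illustrative numbers (the baseline visit count and conversion rate below are hypothetical, not the site's real figures) shows how 20% extra traffic with flat sales implies a conversion-rate drop, and how recovering the old rate on the higher traffic yields sales 20% up:

```python
# Hypothetical baseline, for illustration only
baseline_visits = 1000
baseline_conv_rate = 0.02                              # 2% of visits convert
baseline_sales = baseline_visits * baseline_conv_rate  # 20 sales

# 1st Sept: traffic up 20%, but sales broadly static (on trend)
sept_visits = baseline_visits * 1.20                   # 1200 visits
sept_sales = baseline_sales                            # still ~20 sales
sept_conv_rate = sept_sales / sept_visits              # rate has dropped

# 12th Oct: conversion rate recovers, on the still-higher traffic
oct_sales = sept_visits * baseline_conv_rate

print(round(sept_conv_rate, 4))                 # 0.0167 (down from 0.02)
print(round(oct_sales / baseline_sales, 2))     # 1.2 -> sales now 20% up
```

So "traffic-neutral but conversions recovered" is exactly the combination that turns a flat-sales period into a 20% sales uplift.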
1. How can traffic stay stable, even though conversions come in spurts and then die away?
2. Why do I see sudden changes in country sources for traffic, even though total traffic stays level?
3. Why do the UK SERPs have so many non-UK results at times?
4. Is Google throttling my traffic? Why doesn't it go up no matter what I do?
BIG uplift in traffic since 1st SEPT (20% above trend) with a corresponding drop in conversion rate

Sounds like cherry-picking the traffic most likely to convert and replacing it with low-quality traffic. I wonder where the "good" traffic went. I also wonder whether it's purely being siphoned off for financial gain, or whether your site metrics dictate that this happens to your site.
It might also just be bad luck but you're not the first to make that complaint.
scottsonline: by making your traffic conform to a curve it gives them more time to assess your site

Sure, but that's not the same thing. I'm talking about a deliberate regime of testing, using a LOT of display variables, with varying users, in varying contexts, against varying sites.
That's me. I look at an awful lot of data, but on self-reflection I find that after doing that, I tend to "go with my gut" -- some high-level abstraction of what I "feel" I'm seeing.
To watch it happening, you need a solid grasp of your site's metrics: not just the headline figure, but the drilled-down detail.
Google is trying to become independently capable of discerning what I mean - in other words, to understand the connotation - without needing me to say, precisely and denotatively, what I want.
But what's happening from our perspective is that Google is learning to be smart like a person.
Clearly "localized" and "navigational" are two that have a kind of natural built-in granularity. But I'm kind of stumped as to how "informational" and "transactional" might be made more granular - with regard to the query phrase profile itself (not the website profile, that's another story).
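One way the transactional bucket could gain granularity is by sub-staging on modifier words in the query phrase itself. This is a toy sketch of my own invented rules (nothing here is Google's actual method, and the keyword lists are made up for illustration):

```python
def classify(query: str) -> tuple[str, str]:
    """Map a query phrase to (coarse intent, finer sub-stage) by keywords."""
    q = query.lower()
    # Transactional, late stage: ready-to-buy modifiers
    if any(w in q for w in ("buy", "order", "discount code", "cheapest")):
        return ("transactional", "purchase")
    # Transactional, earlier stage: weighing options
    if any(w in q for w in ("review", "vs", "best", "compare")):
        return ("transactional", "comparison")
    # Informational: question-style phrasing
    if any(w in q for w in ("how to", "what is", "why")):
        return ("informational", "research")
    return ("informational", "unknown")

print(classify("buy nikon d90"))            # ('transactional', 'purchase')
print(classify("nikon d90 vs canon 550d"))  # ('transactional', 'comparison')
print(classify("how to clean a camera lens"))
```

Even rules this crude split "transactional" into buying stages, which is the kind of granularity the query phrase alone might support.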
They could even go as far as using that user's profile and search history to try to determine the stage of the buying process they are at.