Forum Moderators: Robert Charlton & goodroi
Earlier this month our site jumped up quite considerably in rankings across a couple of hundred pages (around 500 pages on the site) and across some pretty hefty keyphrases (millions of searches per month), yet we have not seen a traffic increase.
I mean, we've gone from numbers like #53 and #162 to #2 and #11 - surely you'd see an increase in traffic from that type of jump across hundreds of pages and millions of searchers?
To confirm: the ranking jumps I'm seeing occur when I'm logged into my Google account AND when I'm logged out AND when I use someone else's computer in the next town.
I have a three-year-old site that has averaged 6k uniques a day since its launch. It's a user-generated content site with 50-100 new pages added daily (over three years), and you would think that over time it would see at least a small increase in traffic.
Stuck at the 6k ceiling is making my tin-foil hat glow.
One variation to the pattern that I've noticed is that when a new site first goes online, Googlebot visits, and then, once they believe the site is close to complete (that is, well beyond the initial design stage), they will as often as not bring it up higher in the SERPs than it deserves. We've called it the "Google Gift" here in the past.
But then after a while it drops back to some position defined by the algo, and from that point on the website mostly stays in a narrow zone, with some variation, but not a lot. Like others, I've tried adding new pages, updating text, etc., mostly to no effect -- the numbers remain too consistent to be coincidence.
...............................
I think at some point you do break through this threshold, once you gain the right amount of points. Content alone will not get you where you want to go; I've tried just adding more good-quality content to a couple of sites and the traffic remains about the same.
I think these statements are important. It seems that there is a threshold, and content is not what gets you above it.
Are backlinks the answer then?
In my wilder moments, I see this as a human editorial review factor. "Yes, the algo says this url should jump up. But when I eyeball the backlinks, something smells funny. Let's cap traffic (put the site on a yo-yo ranking) and see if the backlink growth sustains itself. Remind me to check it again in 3 months." Or so goes my fantasy.
The fix would still be real marketing, real development, getting your site known in the marketplace, networking and all that non-geeking old school stuff.
I've mentioned this before, but a good website is like a fine red wine.
If you think that's throttling, then yes, I do believe it happens - but for some very good reasons too.
In my wilder moments, I see this as a human editorial review factor. "Yes, the algo says this url should jump up. But when I eyeball the backlinks, something smells funny. Let's cap traffic (put the site on a yo-yo ranking) and see if the backlink growth sustains itself. Remind me to check it again in 3 months." Or so goes my fantasy.
That's a fascinating idea. One of the basic principles that was emphasized over and over in my college physics courses was to isolate what you're measuring or calculating.
But isolating factors is something that's extremely hard to do in an environment like Google's, where there are hundreds of variables, probably difficult even if you're at Google. Eliminating a major variable that might complicate a test would certainly make sense.
Question I have for those who've experienced the traffic throttling... does it happen just on certain phrases (or the most productive phrases), or is it something that happens across the board?
The yo-yoing that I've seen is entirely query specific.
So if a site gets 10k to 12k visits a day, and one query is responsible for 3-4k, that's the one that gets yo-yo'd. Thus G 'sticks' traffic at 10k by killing the rank as you approach the limit, such that 'normal' variation will top out at 10k. Undershooting? Promote the keyword. Overshooting? Drop it down.
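To make the speculated mechanism concrete, here is a minimal sketch of that feedback loop as pseudocode-style Python. Everything in it is invented for illustration (the cap value, the query names, the demotion step of 10 positions); it is a model of the poster's theory, not of anything Google has confirmed doing.

```python
# Hypothetical "yo-yo" throttle: when daily traffic overshoots a cap,
# demote the single query driving the most visits; when it undershoots,
# promote that query again. All numbers are illustrative assumptions.

def throttle_step(rank_by_query, visits_by_query, cap):
    """Return updated ranks after one day of observed traffic."""
    total = sum(visits_by_query.values())
    # The query responsible for the most visits is the one that gets yo-yo'd.
    top_query = max(visits_by_query, key=visits_by_query.get)
    new_ranks = dict(rank_by_query)
    if total > cap:
        # Overshoot: kill the rank on the top producer (e.g. #2 -> #12).
        new_ranks[top_query] += 10
    elif total < 0.8 * cap:
        # Undershoot: promote the keyword back up (never above #1).
        new_ranks[top_query] = max(1, new_ranks[top_query] - 10)
    return new_ranks

# One simulated day: 12k visits against a 10k cap.
ranks = {"blue widgets": 2, "red widgets": 15}
visits = {"blue widgets": 4000, "red widgets": 8000}
print(throttle_step(ranks, visits, cap=10_000))
```

Run repeatedly, the top query oscillates around whatever rank keeps total traffic near the cap, which is exactly the "narrow zone" behavior described earlier in the thread.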
A very interesting concept.