| 11:31 am on Feb 9, 2009 (gmt 0)|
I've often wondered about Google traffic throttling.
I have a three-year-old site that has averaged 6k uniques a day since its launch. It's a user-generated content site with 50-100 new pages added daily (over three years), and you would think that over time it would see at least a small increase in traffic.
Stuck at the 6k ceiling is making my tin-foil hat glow.
| 7:31 pm on Feb 9, 2009 (gmt 0)|
I totally agree with everyone who is seeing Google moderating the daily traffic figures. I've been watching this at numerous sites for several years and can see no other explanation for the very slight traffic variations at each site from month to month.
One variation to the pattern that I've noticed is that when a new site first goes online, the Googlebot visits, then when they believe that the site is close to its fulfillment (that is, the site is well beyond the initial design stage), they will as often as not bring it up higher in the SERPS than it deserves. We've called it the "Google Gift" here in the past.
But then after a while it drops back to some position defined by the algo, and from that point on, the website mostly stays in a narrow zone, with some variation, but not a lot. Like others, I've tried adding new pages, updating text, etc, to mostly no effect -- the numbers remain too consistent to be coincidence.
| 10:33 pm on Feb 9, 2009 (gmt 0)|
|I think at some point you do break through this threshold, once you gain the right amount of points. Content alone will not get you where you want to go, I've tried just adding more good quality content to a couple of sites and the traffic remains about the same. |
I think these statements are important. It seems that there is a threshold, and content is not what gets you above it.
Are backlinks the answer then?
| 10:45 pm on Feb 9, 2009 (gmt 0)|
One factor may be backlinks that continue to AGE - freely given editorial links from well trusted sites, not from anything that looks like the webmaster arranged it, or that goes away after a few months.
In my wilder moments, I see this as a human editorial review factor. "Yes, the algo says this url should jump up. But when I eyeball the backlinks, something smells funny. Let's cap traffic (put the site on a yo-yo ranking) and see if the backlink growth sustains itself. Remind me to check it again in 3 months." Or so goes my fantasy.
The fix would still be real marketing, real development, getting your site known in the marketplace, networking and all that non-geeking old school stuff.
| 11:59 pm on Feb 9, 2009 (gmt 0)|
Tedster, I think you're so close it's scary. I did notice a couple of Mountain View checks, including a site: command, shortly before the traffic dropped off. I had put these down to the AdSense team in the past, but who knows for real. I do expect it's a little more algo than human, however, and we did have one site break through recently and it's now up around 15k per day. Lots and lots of "promotion" went into it.
| 2:46 am on Feb 10, 2009 (gmt 0)|
Every time I think we're going to top out, we make another move up in traffic. Our content is very good and is written for eyeballs, not search engines. We get links the old-fashioned way.
I've mentioned this before, but a good website is like a fine red wine.
If you think that's throttling, then yes, I do believe it happens - but for some very good reasons too.
| 7:59 pm on Feb 12, 2009 (gmt 0)|
|In my wilder moments, I see this as a human editorial review factor. "Yes, the algo says this url should jump up. But when I eyeball the backlinks, something smells funny. Let's cap traffic (put the site on a yo-yo ranking) and see if the backlink growth sustains itself. Remind me to check it again in 3 months." Or so goes my fantasy. |
That's a fascinating idea. One of the basic principles that was emphasized over and over in my college physics courses was to isolate what you're measuring or calculating.
But isolating factors is something that's extremely hard to do in an environment like Google's, where there are hundreds of variables, probably difficult even if you're at Google. Eliminating a major variable that might complicate a test would certainly make sense.
Question I have for those who've experienced the traffic throttling... does it happen just on certain phrases (or the most productive phrases), or is something that happens across the board?
The yo-yoing that I've seen is entirely query specific.
| 8:57 am on Feb 13, 2009 (gmt 0)|
Ah, but if this is true, the KW-specific phrase will be high-value, whereby the fiddling of it will keep the overall traffic within defined bounds.
So if a site gets 10k to 12k visits a day, and one query is responsible for 3-4k, that's the one that gets yo-yo'd. Thus G 'sticks' traffic at 10k by killing the rank as you approach the limit, such that 'normal' variation will top out at 10k. Undershooting? Promote the KW. Overshooting? Drop it down.
A very interesting concept.
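The control loop being hypothesized above could be sketched roughly like this. To be clear, this is purely illustrative speculation in code form -- the target, tolerance, and rank-step values are made-up assumptions, not anything Google has confirmed:

```python
# Hypothetical sketch of the "yo-yo" throttling idea discussed above.
# All names and numbers here are invented for illustration only.

TARGET_DAILY_VISITS = 10_000   # the supposed traffic ceiling for a site
TOLERANCE = 0.10               # allow ~10% "normal" variation

def adjust_rank(current_visits: int, high_value_kw_rank: int) -> int:
    """Return a new rank for the site's highest-volume query.

    Overshooting the target demotes the keyword (bigger rank number);
    undershooting promotes it (smaller rank number). Lower rank = better.
    """
    if current_visits > TARGET_DAILY_VISITS * (1 + TOLERANCE):
        return high_value_kw_rank + 5          # overshoot: drop it down
    if current_visits < TARGET_DAILY_VISITS * (1 - TOLERANCE):
        return max(1, high_value_kw_rank - 5)  # undershoot: promote the KW
    return high_value_kw_rank                  # within bounds: leave it alone
```

On this model, only the one high-volume query gets fiddled with, which matches the earlier observation that the yo-yoing is entirely query-specific.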
| 5:09 am on Feb 14, 2009 (gmt 0)|
|Ah, but if this is true, the KW-specific phrase will be high-value whereby the fiddling of it will keep the overall traffic within defined bounds. |
These are in fact the kinds of phrases on which I'm seeing the yo-yo effect.
| 10:23 am on Feb 14, 2009 (gmt 0)|
How about looking at the flip side: you have a new site that starts to rank for a competitive high-volume term, but the traffic is just not happening. In effect, I think your site is being used as "fill", part of the bigger rotation of results that keeps more established sites at a given level. We have one site now hitting the front page for a term that should deliver huge volumes of traffic, but we are not seeing it at all. It's real; it's the modern version of the sandbox. Just because Google feeds my IP a given set of results does not mean everyone is seeing what I see. This is the way I am thinking this week. The tougher the vertical, the greater the threshold of links and time required to break through it.