To clarify my interpretation of Google Traffic Throttling before continuing: we have received almost exactly the same amount of traffic from Google since late last year (Yahoo, Bing, referral and direct traffic always vary). The Google traffic graph is so predictable I could draw next week's and not be far off.
My understanding of how Google is accomplishing this with my site (I'm sure some sites are throttled differently) is by moving us around in the rankings. We bounce from the ever-theorised No. 4 position to nowhere, from No. 9 to No. 900, and so on throughout the day, but the jumps are not predictable. I never know which page is going to be tanked from day to day but it is always a few decent traffic pages that suddenly vanish.
Why does this mean I have to reduce services? Well, almost 90% of my site's pages are user-based - industry news and consumer comments. Ads do not convert well on those pages. Our commercial sections convert very well.
This means that if our "traffic allowance" is used up by people visiting the news/discussion pages, we see a dramatic drop in visitors from Google to our commercial pages. If you put the news pages graph over the commercial pages graph, it is simply amazing. The drops in commercial traffic correspond with increases in news traffic. When news traffic is low (Sat, Sun, Mon in our industry) the commercial traffic is much higher.
But the totals never pass our allotted Google traffic.
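For anyone who wants to check this overlay effect on their own logs, here is a rough sketch of one way to do it (Python; the numbers are made-up placeholders, not our real figures -- substitute your own exported daily Google referral counts for the two page groups):

# Minimal sketch: compare daily Google referrals to "news" pages vs
# "commercial" pages and see whether they move in opposite directions
# while the combined total stays flat. The numbers below are made up
# purely for illustration -- substitute your own analytics export.

news = [2100, 2350, 1900, 2500, 2600, 1400, 1300]        # Mon..Sun
commercial = [1380, 1130, 1580, 980, 880, 2080, 2180]    # Mon..Sun

def mean(xs):
    return sum(xs) / len(xs)

def pearson(xs, ys):
    """Plain Pearson correlation; -1 means the two series mirror each other."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

totals = [n + c for n, c in zip(news, commercial)]
print("correlation news vs commercial:", round(pearson(news, commercial), 3))
print("daily totals:", totals)  # a near-constant list is the "capped" pattern

A correlation close to -1 together with a near-constant totals list is exactly the pattern described above; looking at either series on its own hides it.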
What would you do? Would you stop providing free services altogether so that your whole traffic allowance goes toward commercial pages?
Someone said that links get you out of traffic throttling, but after almost a year and thousands of links we have seen no movement ... other than an increase in our daily Google allowance of 100 visitors a couple of months ago. Is that the rate? Every 12 months and 5,000 links we are allowed another 100 visitors per day?
The effect I have seen between September 2008 and June 2009 is that the traffic to the top 4 levels has increased but there has been an almost exact corresponding drop in traffic to level 5.
I have checked in case some of the searchers that were reaching level 5 pages are now bringing searchers in through level 4, but that is not the case. The level 4 increase is entirely due to improvements in level 4 keywords.
The table below shows the average referrals from Google per day to each of the levels.
Level      Sep08   Jun09
Level 1       19      25
Level 2       22      26
Level 3       55     133
Level 4      873    1480
Level 5     2506    1815
Totals      3475    3479
I used to believe in the "throttle" theory... until I started looking at my site in a new way. I realized that yes, my content was growing, but it was not branching out enough. Hence, I had maxed out my buckets.
After all, there are only so many people looking for "round blue single edge widgets" in any given 24-hour period.
If you have a set amount of PageRank but double the number of pages on your site, then mathematically it stands to reason that the amount of PageRank per page will be watered down by some amount.
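To put a rough number on that dilution, here is a back-of-the-envelope sketch -- the textbook PageRank recurrence on a toy site (a homepage linking to every inner page and each inner page linking back), not anything Google actually does:

# Back-of-the-envelope sketch of PageRank dilution on a toy site:
# a homepage that links to every inner page, and every inner page
# that links only back to the homepage. This is the textbook
# PageRank recurrence, not anything Google-specific.

DAMPING = 0.85

def pagerank(num_inner_pages, iterations=100):
    n = num_inner_pages + 1          # index 0 = homepage, the rest = inner pages
    pr = [1.0 / n] * n
    for _ in range(iterations):
        new = [(1 - DAMPING) / n] * n
        # the homepage splits its PageRank across all inner pages
        share = DAMPING * pr[0] / num_inner_pages
        for i in range(1, n):
            new[i] += share
        # every inner page links only back to the homepage
        new[0] += DAMPING * sum(pr[1:])
        pr = new
    return pr

for inner in (50, 100):
    pr = pagerank(inner)
    print(f"{inner} inner pages -> PR per inner page ~ {pr[1]:.5f}")

Doubling the inner pages roughly halves the PageRank each one ends up with, which is the "watering down" effect -- unless the new pages also pull in new external links.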
I think the title of this thread needs to be changed. As you can tell from the multiple comments like the ones above, people seem to be under the impression that the complaint is "I'm not getting enough traffic, Google must be keeping it back!" ...
... maybe tedster's usage of "plateau" would be more apt? We're not talking about low traffic. We're talking about Google cutting off traffic to a site once it reaches a certain magic number during the day. The referrals for each keyword sending traffic to the site fluctuate over time (as they should), but the overall traffic going to that site DOES NOT change over time.
e.g. (visitors sent through from Google, per keyword):

           keyword 1   keyword 2   keyword 3   Total
Day 1            534         129         985    1648
Day 2            234         429         942    1605
Day 3            751         532         324    1607
Each day the traffic from each keyphrase varies, but the overall total/plateau/cut-off remains practically the same. That is the "throttling" I was referring to.
The above example is obviously a dramatic representation to get a point across.
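If you want to check whether your own site shows that shape, here is a quick sketch of the kind of comparison to run on an analytics export (the figures are just the illustrative ones from the example above):

# Sketch of a quick check to run on your own export: if individual
# keyword referrals swing widely day to day but the daily total barely
# moves, the total's spread will be far smaller than the keywords' spreads.
# Figures below are the illustrative ones from the example above.

daily_referrals = {
    "keyword 1": [534, 234, 751],
    "keyword 2": [129, 429, 532],
    "keyword 3": [985, 942, 324],
}

def spread(xs):
    """Relative spread: (max - min) / mean, a rough volatility measure."""
    return (max(xs) - min(xs)) / (sum(xs) / len(xs))

totals = [sum(day) for day in zip(*daily_referrals.values())]

for kw, series in daily_referrals.items():
    print(f"{kw}: day-to-day spread {spread(series):.0%}")
print(f"daily totals {totals}: spread {spread(totals):.0%}")

On a genuinely capped site the totals line stays in the low single digits while the individual keywords swing by 50-100% or more.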
Add to that Google regional search (where applicable, Google will cut your site off for people who are not searching from your region) and that might be the plateau you are seeing.
I think everyone is obsessing over this little bit and the bottom line is that if you want more traffic, you need to expand your content out.
If there are multiple pages ranked approximately equal for a particular query, test them at different positions and see if user satisfaction metrics are affected. Depending on query volume and the pages themselves, this test may take months to run.
Use Google's own Website Optimizer product, and then think about how they would programmatically apply that to their own SERPs.
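Purely as an illustration of the idea (this is not Google's actual method, and the satisfaction numbers are invented), here is what a programmatic position test of two roughly equal pages might look like in miniature:

import random

# Purely hypothetical sketch of position testing: rotate two roughly
# equal results through the same slot and compare a crude satisfaction
# proxy (here, a simulated "long click" rate). Not Google's method --
# just an illustration of the multivariate-testing idea above.

random.seed(1)

TRUE_SATISFACTION = {"page_a": 0.62, "page_b": 0.58}   # assumed, unknown to the "tester"

def run_day(page, impressions=500):
    """Simulate one day of impressions in the test slot for one page."""
    return sum(random.random() < TRUE_SATISFACTION[page] for _ in range(impressions))

results = {"page_a": [0, 0], "page_b": [0, 0]}   # [satisfied clicks, impressions]
for day in range(60):                            # alternate the pages day by day
    page = "page_a" if day % 2 == 0 else "page_b"
    results[page][0] += run_day(page)
    results[page][1] += 500

for page, (good, total) in results.items():
    print(f"{page}: {good}/{total} satisfied = {good/total:.1%}")

With only a few percentage points between the pages, it takes a lot of impressions before a winner emerges -- which is why such a test could run for months on lower-volume queries.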
-- daily visitors
-- from Google organic search
-- one keyword/URL combination

midnight - 1am    32 visits
1am - 2am         72 visits
2am - 3am         94 visits
3am - 4am         53 visits
4am - 5am         27 visits
5am - 6am         31 visits
6am - 7am          6 visits
7am - 8am          0 visits
8am - 9am          0 visits
9am - 10am         0 visits
10am - 11am        0 visits
11am - noon        0 visits
...all zeros until the next midnight
Repeat that pattern day after day and you have quite a dramatic picture. If I hadn't seen it, I wouldn't believe it.
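If you want to check your own logs for that shape, a simple sketch along these lines will flag it (the hourly counts here are the ones from the table above -- swap in your own):

# Sketch of a log check for the pattern above: hours with traffic early
# in the day, then an abrupt drop to zero that holds until midnight.
# The hourly counts are the ones from the table above; swap in your own.

hourly_visits = [32, 72, 94, 53, 27, 31, 6] + [0] * 17   # 24 hourly buckets

def cutoff_hour(hours):
    """Return the hour after which every bucket is zero, or None if traffic
    continues (ordinary sites trail off but rarely hit a hard wall)."""
    for h in range(len(hours)):
        if all(v == 0 for v in hours[h:]) and any(v > 0 for v in hours[:h]):
            return h
    return None

h = cutoff_hour(hourly_visits)
print(f"traffic stops dead at hour {h}" if h is not None else "no hard cutoff")

Seeing the same cutoff hour day after day, rather than a gradual tail-off, is what makes the picture so dramatic.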
Again, I don't think we're talking about anything very widespread at all, and I don't want to create major concerns or hypochondria about this. In the opening post, internetheaven said something very interesting:
I never know which page is going to be tanked from day to day but it is always a few decent traffic pages that suddenly vanish.
That to me is the most telling observation. There is some mechanism at work, even if we don't understand it all that well. If you want to know if your sites are being affected in this way, you should drill down in your analytics. Don't just assume that [traffic doesn't grow] = [intentional throttling]
I have long suspected that I too experience the second kind, the kind described by internetheaven. I only get x,000 referrals from G per day, but there are long stretches of time during which the exact number is remarkably flat (it varies by day of the week, but for weekdays the total referrals from G are almost exactly the same for months at a time). I always assumed this was due to statistical averaging of user behavior across a wide range of factors, but I'll have to look at the data in more detail and see if there really is evidence of throttling.
If so, I can also support the theory that changes in internal link structure make a difference.
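One rough way to make that closer look concrete (a statistical sketch only, with placeholder numbers standing in for a real per-keyword export): if the keywords behaved independently, the daily total should wobble roughly as much as the sum of the individual wobbles predicts; a total that is far flatter than that is harder to put down to simple averaging.

import statistics

# Rough sanity check on the "it's just averaging" explanation: if keyword
# referrals were independent, Var(total) should be close to the sum of the
# per-keyword variances. A total much flatter than that predicts is at
# least consistent with something constraining it. Numbers are
# placeholders -- use your own per-keyword daily exports.

keyword_series = [
    [534, 234, 751, 610, 480, 390, 702],
    [129, 429, 532, 250, 377, 488, 198],
    [985, 942, 324, 745, 750, 730, 705],
]

totals = [sum(day) for day in zip(*keyword_series)]

independent_var = sum(statistics.pvariance(s) for s in keyword_series)
observed_var = statistics.pvariance(totals)

print(f"variance if keywords were independent: {independent_var:.0f}")
print(f"observed variance of daily totals:     {observed_var:.0f}")
print(f"ratio (well below 1.0 = suspiciously flat): {observed_var / independent_var:.2f}")

It is not proof of throttling, but a ratio far below 1.0 would at least say the flatness is more than ordinary averaging.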
These waves seem to be regulated in such a way that practically every country page gets a chance to have higher traffic during certain periods of the year, irrespective of the varying degrees of competition that we face. I would normally expect the traffic to be correlated, to a large extent, with the population of those countries (that is, more traffic from highly populated countries) and with the degree of competition that we face.
I don't complain about it as our overall traffic is not erratic and conversion rate is quite stable. It seems that an invisible hand, in the form of throttling, is at work.
I'm inclined to believe that G is doing a kind of multivariate testing, as our Moderator pointed out, to see each website's relevance for its users from different angles, primarily based on keywords.
@tedster: Would it happen to be that the site in the table just above uses AdWords? If that is the case, maybe they are clearing the decks for "paid clicks" during working hours.
If tedster has made that sort of schoolboy error, that would be quite a problem considering how much stock people on WW boards put in his input!
Perhaps, instead of debating the existence of throttling based on logs, we could clarify a test?
I'm curious as to what the "non-believers" would take as proof of traffic caps existing?
Still, I'm skeptical about the notion of "throttling," because I saw the same predictable referral patterns on Infoseek, Altavista, etc. back in the 1990s. I'm not a statistician, but I do know that retailers, airlines, etc. can predict their "traffic" (whether it's measured in dollars or passengers or something else) with remarkable accuracy. Just a few weeks ago, the catering manager of a cruise ship was telling me something that I've heard from his peers on other ships: He can predict how many people will order the surf-and-turf, how many will order the roast beef, etc. There's no need to "throttle" the numbers of surf-and-turfs or roast beefs served, because usage patterns have their own natural rhythm. If that's true of surf-and-turf and roast beef (or of airline passengers or a retail store's daily sales), why shouldn't it be true of search traffic?
I might venture to say that if it is a new website it might be some sort of sandbox test where Google experiments a bit on search terms, but if it is an established site that is following the webmaster guidelines and is trusted, I would have a hard time believing it.
Ted, is the site established and trusted in big Google's eyes? Also, for those keywords, have you checked the SERPs every hour to see where they sit ranking-wise?
That could either be throttling, or it could be the effect of different data sets being returned by Google based on their server load balancing.
Some food for thought: have you ever noticed that when you edit a page title a few days after posting an article, the search results will jump between the new and the old title for a while?
Different data sets being returned at different times could be perceived as throttling...
based on their server load balancing
I thought about this, but don't think it's so. Load balancing doesn't work by the clock, it works by network demand.
Different data sets being returned at different times could be perceived as throttling
Or that could be the technical mechanism for traffic throttling, too.
They have to be extremely sensitive to anti-trust issues and to situations where algo changes have hit the little guy very hard. Was it last year that the New York Times picked up the story about the small webmaster whose business was wiped out? We have all seen coverage like this.
There must be just as many sites with the inverse of tedster's traffic graph - what about them? Maybe Google is using throttling as a fuse to soften the effects of its algo, PR, etc. changes.
Maybe next time I want to get some really good SERPs I'll set my alarm for 3am ;)
Was it last year that the New York Times picked up the story about the small webmaster whose business was wiped out?
That story was discussed here at length, and as I recall, the Web site in question didn't win a "Rose of the Month" award when subjected to a sniff test.
As for Google using throttling to "soften the effects of its algo, pr etc. changes," how would that improve Google's core product (Google Web Search) and maintain the loyalty of its end users?
how would that improve Google's core product
But also, I think we have to acknowledge that in 2009, there are in all likelihood more than 10 quality sites for most queries (admittedly not all). Therefore, moving things around can be seen as improving the core product -- like any other website or service, Google needs to be seen as dynamic, with fresh content. I would hazard the guess that a lot of people like the idea of seeing new SERPs from time to time, as it shows them options they may not have otherwise discovered.
So looked at from that point of view, fresh SERPs satisfy the searcher, spread the traffic around to more websites, and make Google seem more up to date. But again, that can be true only if all the top-listed sites provide quality content of approximately equal value.