Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Google Traffic Throttling means we have to reduce user services

         

internetheaven

9:28 am on Aug 5, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Been working on this most of this year and sadly, the data shows that when we provide more services for users in our free sections, traffic to our commercial sections drops.

To clarify my interpretation of Google Traffic Throttling before continuing: we get almost exactly the same amount of traffic from Google and have done since late last year (Yahoo, Bing, referral and direct traffic always varies). The Google traffic graph is so predictable I could draw next week's and not be far off.

My understanding of how Google is accomplishing this with my site (I'm sure some sites are throttled differently) is by moving us around in the rankings. We bounce from the ever-theorised number-4 position to nowhere, from No.9 to No.900 and so on throughout the day, but the jumps are not predictable. I never know which page is going to be tanked from day to day, but it is always a few decent-traffic pages that suddenly vanish.

Why does this mean I have to reduce services? Well, almost 90% of my site's pages are user-based - industry news and consumer comments. Ads do not convert well on those pages. Our commercial sections convert very well.

This means that if our "traffic allowance" is used up by people visiting the news/discussion pages, we see a dramatic drop in visitors from Google to our commercial pages. If you lay the news pages graph over the commercial pages graph, it is simply amazing. The drops in commercial traffic correspond with increases in news traffic. When news traffic is low (Sat, Sun, Mon in our industry), the commercial traffic is much higher.

But the totals never pass our allotted Google traffic.

What would you do? Would you stop providing free services altogether so that your whole traffic allowance goes toward commercial pages?

Someone said that links get you out of traffic throttling, but after almost a year and thousands of links we have seen no movement ... other than an increase in our daily Google allowance of 100 visitors a couple of months ago. Is that the rate? Every 12 months and 5,000 links we are allowed another 100 visitors per day?

thord

8:51 am on Aug 13, 2009 (gmt 0)

10+ Year Member Top Contributors Of The Month



Please clarify re throttling: are you all talking exclusively about google.com, or Google worldwide? Image searches are presumably excluded from the analysis. For my own sites, google.com is always in a clear minority, with searches from G centres in a significant number of foreign countries dominating in the aggregate. (G visits # stable, but not fixed.)

kidder

9:28 am on Aug 13, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



In recent times we had a site go from 2,500-3,000 uniques (Google) per day to over 10,000 per day. It was like we crossed a threshold of sorts. There was no huge shift in rankings, just a dam-busting type of change and a flood of traffic, pretty much in the space of a few days. We suspect to some extent we were released from the sandbox - the new, improved version if you like. One of our other large sites is a 2000 domain and gets new content daily; that tld has a glass ceiling without doubt.

I think the way traffic is throttled depends very much on how much inventory Google has in any given vertical to "improve" the results. I've seen our pages pop for terms they should not be ranking for; the conclusion we draw from this is that we are being used as "fill" in order to displace another site that may be at quota. Makes sense, yeah?

It's real, so how do you break the shackles? Now that is the real question, and I suspect the answer lies in breaking the link / age / trust threshold that applies to the space you're playing in. If it's real estate or travel related, it may take a lot more work than something right off the grid like "snail hunting in Samoa", for example... Just my thoughts on the subject.

olias

12:50 pm on Aug 13, 2009 (gmt 0)

10+ Year Member



One of my sites is essentially a 5 level structure, which hadn't really changed over the last year or so in terms of architecture. I have made some on page optimisations and the site has been very slowly picking up new links. TBPR went up one for the front page a few months back.

The effect I have seen between September 2008 and June 2009 is that the traffic to the top 4 levels has increased but there has been an almost exact corresponding drop in traffic to level 5.

I have checked in case some of the searchers that were reaching level 5 pages are now bringing searchers in through level 4, but that is not the case. The level 4 increase is entirely due to improvements in level 4 keywords.

The table below shows the average referrals from Google per day to each of the levels.


Month    Sep08  Jun09
Level 1     19     25
Level 2     22     26
Level 3     55    133
Level 4    873   1480
Level 5   2506   1815
Totals    3475   3479

internetheaven

2:06 pm on Aug 13, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I used to believe in the "throttle" theory... until I started looking at my site in a new way. I realized that yes, my content was growing, but it was not branching out enough. Hence, I had maxed out my buckets.

After all, there are only so many people looking for "round blue single edge widgets" in any given 24-hour period.

If you have a set amount of PageRank but double the number of pages on your site, then mathematically it stands to reason that the amount of PageRank per page will be watered down by some amount.

I think the title of this thread needs to be changed. As you can tell from the multiple comments like the ones above, people seem to be under the impression that the complaint is "I'm not getting enough traffic, Google must be keeping it back!" ...

... maybe tedster's usage of "plateau" would be more apt? We're not talking about low traffic. We're talking about Google cutting off traffic to a site once it reaches a certain magic number during the day. The referrals for each keyword sending traffic to the site fluctuate over time (as they should), but the overall traffic going to that site DOES NOT change over time.

e.g. Day 1: keyword 1 sends through 534 visitors
keyword 2 sends through 129 visitors
keyword 3 sends through 985 visitors
Total: 1648 visitors

e.g. Day 2: keyword 1 sends through 234 visitors
keyword 2 sends through 429 visitors
keyword 3 sends through 942 visitors
Total: 1605 visitors

e.g. Day 3: keyword 1 sends through 751 visitors
keyword 2 sends through 532 visitors
keyword 3 sends through 324 visitors
Total: 1607 visitors

Each day the traffic from each keyphrase varies, but the overall total/plateau/cut-off remains practically the same. That is the "throttling" I was referring to.

The above example is obviously a dramatic representation to get a point across.
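The "flat total, varying keywords" pattern can be put in numbers. A minimal sketch, using the illustrative figures from the example above; the coefficient-of-variation comparison is my own assumed way of testing for the pattern, not anything from the thread:

```python
# Sketch: quantify internetheaven's pattern -- daily totals that stay
# nearly constant while the individual keyword counts swing widely.
# The referral figures are the illustrative numbers from the post above.
from statistics import mean, stdev

# rows = days, columns = keywords 1-3
daily_referrals = [
    [534, 129, 985],
    [234, 429, 942],
    [751, 532, 324],
]

def cv(values):
    """Coefficient of variation: sample stdev relative to the mean."""
    return stdev(values) / mean(values)

totals = [sum(day) for day in daily_referrals]
per_keyword = list(zip(*daily_referrals))  # transpose: one series per keyword

total_cv = cv(totals)
keyword_cv = mean(cv(series) for series in per_keyword)

# If the totals are far steadier than the individual keywords (here the
# totals' CV is roughly 30x smaller), that is the thread's "throttling".
print(f"total CV:   {total_cv:.3f}")
print(f"keyword CV: {keyword_cv:.3f}")
```

On this toy data the totals vary by about 1.5% while each keyword varies by about 50%, which is the shape of the claim being made.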

trinorthlighting

3:39 pm on Aug 13, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



That example shows there is a daily average of people searching those keywords. You see the difference in each keyword for each day, but if those terms are similar, I would fully expect this type of outcome.

Add to that Google regional search (Google will cut your site off if people are not searching in your region if applicable) and that might be the plateau you are seeing.

I think everyone is obsessing over this little bit and the bottom line is that if you want more traffic, you need to expand your content out.

whoisgregg

4:23 pm on Aug 13, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



What we perceive as throttling, Google would probably call multivariate testing.

If there are multiple pages ranked approximately equal for a particular query, test them at different positions and see if user satisfaction metrics are affected. Depending on query volume and the pages themselves, this test may take months to run.

Use Google's own Website Optimizer product, and then think about how they would programmatically apply that to their own SERPs.
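The testing idea described above can be sketched as a toy simulation: rotate the same page through different positions and see whether the post-click response holds up. Everything here (the CTR-by-position numbers, the "quality" parameter, the satisfaction metric) is invented for illustration and says nothing about Google's actual system:

```python
# Toy sketch of position testing: serve one page at different SERP
# positions and compare how satisfied its clickers are. All numbers
# are made up for illustration.
import random

random.seed(42)

# Assumed click-through rate by SERP position (invented, roughly log-shaped)
POSITION_CTR = {1: 0.30, 4: 0.08, 9: 0.03}

def run_test(page_quality, position, impressions=10_000):
    """Simulate impressions; a click 'satisfies' in proportion to quality."""
    clicks = satisfied = 0
    for _ in range(impressions):
        if random.random() < POSITION_CTR[position]:
            clicks += 1
            if random.random() < page_quality:
                satisfied += 1
    return satisfied / clicks if clicks else 0.0

# Rotate the same page through different positions on different "days".
# Position changes the click volume, not the satisfaction rate -- which
# is exactly what such a test would be trying to measure.
for pos in (1, 4, 9):
    rate = run_test(page_quality=0.6, position=pos)
    print(f"position {pos}: satisfaction {rate:.2f}")
```

The point of the sketch: lower positions yield fewer clicks, so a test at position 9 needs far longer to reach significance, which fits the "this test may take months" remark above.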

tedster

6:33 pm on Aug 13, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



This is what I'm calling throttling:

-- daily visitors
-- from Google organic Search
-- one keyword/url combination

mid - 1am 32 visits
1am - 2am 72 visits
2am - 3am 94 visits
3am - 4am 53 visits
4am - 5am 27 visits
5am - 6am 31 visits
6am - 7am 06 visits
7am - 8am 00 visits
8am - 9am 00 visits
9am - 10am 00 visits
10am - 11am 00 visits
11am - noon 00 visits
.
.
.
...all zeros until the next midnight

Repeat that pattern day after day and you have quite a dramatic picture. If I hadn't seen it, I wouldn't believe it.
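That hard-stop shape can be written as a simple check. The hourly figures below are the ones from the table; the trailing-zero-run rule is only an assumed way to flag a "quota-like" day, not a known mechanism:

```python
# Sketch: flag the pattern in the table above -- real traffic early in
# the day, then an unbroken run of zero-visit hours until midnight.
# The hourly figures are copied from the table; the rule is an assumption.

HOURLY_VISITS = [32, 72, 94, 53, 27, 31, 6] + [0] * 17  # midnight..11pm

def looks_quota_capped(hours, min_zero_run=6):
    """True if the day ends with a long unbroken run of zero-visit hours
    after at least some real traffic earlier on."""
    if sum(hours) == 0:
        return False  # no traffic at all is a different problem
    last_nonzero = max(i for i, v in enumerate(hours) if v > 0)
    trailing_zeros = len(hours) - 1 - last_nonzero
    return trailing_zeros >= min_zero_run

print(looks_quota_capped(HOURLY_VISITS))  # → True for the table above
```

Run against a keyword/URL combination's hourly logs over many days, a check like this would separate "traffic just tapers off" from "traffic stops dead at the same point every day".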

Again, I don't think we're talking about anything very widespread at all, and I don't want to create major concerns or hypochondria about this. In the opening post, internetheaven said something very interesting:

I never know which page is going to be tanked from day to day but it is always a few decent traffic pages that suddenly vanish.

That to me is the most telling observation. There is some mechanism at work, even if we don't understand it all that well. If you want to know if your sites are being affected in this way, you should drill down in your analytics. Don't just assume that [traffic doesn't grow] = [intentional throttling]

[edited by: tedster at 6:39 pm (utc) on Aug. 13, 2009]

trinorthlighting

6:38 pm on Aug 13, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



How does this example compare to Yahoo and Bing traffic?

tedster

6:40 pm on Aug 13, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yahoo and Bing show traffic throughout the day on that keyword/url. It's definitely a Google-related pattern.

freejung

6:49 pm on Aug 13, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think tedster and internetheaven are talking about slightly different things. tedster, you're referring to a cutoff for a particular keyword once it reaches a certain number of clickthroughs. internetheaven, you're talking about G adjusting your rankings across a variety of different keywords to achieve the exact same amount of traffic to the entire site each day.

I have long suspected that I too experience the second kind, the kind described by internetheaven. I only get x,000 referrals from G per day, but there are long stretches of time during which the exact number is remarkably flat (it varies by day of the week, but for weekdays the total referrals from G are almost exactly the same for months at a time). I always assumed this was due to statistical averaging of user behavior across a wide range of factors, but I'll have to look at the data in more detail and see if there really is evidence of throttling.

If so, I can also support the theory that changes in internal link structure make a difference.

tedster

7:03 pm on Aug 13, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think I'm talking one example of a pattern that internetheaven sees on different keywords at different times. So our examples are most likely related - mine is more granular, and his is an aggregate.

Josun

7:01 am on Aug 14, 2009 (gmt 0)

10+ Year Member



We have a website with pages divided almost equally for people from different countries, with similar format and slightly different content. Traffic moves from one set of countries to another while we make no changes in the contents or formatting of those pages, total traffic staying about the same.

These waves seem to be regulated in such a way that practically every country page gets a chance at higher traffic during certain periods of the year, irrespective of the varying degrees of competition we face. I would normally expect traffic to correlate, to a large extent, with the population of those countries - that is, more traffic from highly populated countries - and with the degree of competition we have.

I don't complain about it as our overall traffic is not erratic and conversion rate is quite stable. It seems that an invisible hand, in the form of throttling, is at work.

I'm inclined to believe that G is doing a kind of multivariate testing, as our moderator pointed out, to see each website's relevance for its users from different angles, primarily based on keywords.

plumsauce

10:57 am on Aug 14, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



@tedster

Would it happen to be that the site in the table just above uses adwords? If that is the case, maybe they are clearing the decks for "paid clicks" during working hours.

internetheaven

2:00 pm on Aug 14, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



@tedster

Would it happen to be that the site in the table just above uses adwords? If that is the case, maybe they are clearing the decks for "paid clicks" during working hours.

If tedster has made that sort of schoolboy error, that would be quite a problem considering how much stock people on the WW boards put in his input!

Perhaps, instead of debating the existence of throttling based on logs, we could clarify a test?

I'm curious as to what the "non-believers" would take as proof of traffic caps existing?

physics

3:48 pm on Aug 14, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I usually don't buy into a lot of these theories but this one explains something. My rankings have been steady for a certain term and then recently they started dropping for no apparent reason. Then I checked my stats and my traffic from Google for that term had actually INCREASED a little. So, perhaps, search traffic for that term started increasing and Google moved me down in the rankings to adjust how much traffic I'm getting. Very interesting idea... Hard to complain since I'm getting a little more traffic but it is sad to see my rankings go down nonetheless.

signor_john

4:55 pm on Aug 14, 2009 (gmt 0)



I see week-to-week similarities for the same day of the week, but there are considerable variations within any given week. So, if Google is "throttling" my traffic, it's using a different redline on Monday or Tuesday than it is on Friday or Saturday.

Still, I'm skeptical about the notion of "throttling," because I saw the same predictable referral patterns on Infoseek, Altavista, etc. back in the 1990s. I'm not a statistician, but I do know that retailers, airlines, etc. can predict their "traffic" (whether it's measured in dollars or passengers or something else) with remarkable accuracy. Just a few weeks ago, the catering manager of a cruise ship was telling me something that I've heard from his peers on other ships: He can predict how many people will order the surf-and-turf, how many will order the roast beef, etc. There's no need to "throttle" the numbers of surf-and-turfs or roast beefs served, because usage patterns have their own natural rhythm. If that's true of surf-and-turf and roast beef (or of airline passengers or a retail store's daily sales), why shouldn't it be true of search traffic?

trinorthlighting

6:07 pm on Aug 14, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I do not buy into this either. We have seen no evidence of it on our 5 websites, and we have checked: there is no cut-off where traffic ever goes to 0 on any of our monitored keywords. Every hour, people hit us on our keywords from all over the world.

I might venture to say that if it is a new website, it might be some sort of sandbox test where Google experiments a bit on search terms, but if it is an established site that follows the webmaster guidelines and is trusted, I would have a hard time believing it.

Ted, is the site established and trusted in big Google's eyes? Also, on those keywords, have you checked the SERPs every hour to see where they sit ranking-wise?

JS_Harris

9:06 am on Aug 16, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Tedster: re "This is what I'm calling throttling".

That could either be throttling or it could be the effect of different data sets being returned by google based on their server load balancing.

Some food for thought: have you ever noticed that sometimes, when you edit a page title a few days after posting an article, the search results will jump between the new and the old page title for a while?

Different data sets being returned at different times could be perceived as throttling...

tedster

9:44 am on Aug 16, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



based on their server load balancing

I thought about this, but don't think it's so. Load balancing doesn't work by the clock, it works by network demand.

Different data sets being returned at different times could be perceived as throttling

Or that could be the technical mechanism for traffic throttling, too.

wingslevel

1:36 pm on Aug 18, 2009 (gmt 0)

10+ Year Member



clearly google started off in the search arena as 'david', and now, with some 80% share, they are goliath.

they have to be extremely sensitive to anti-trust issues and to situations where algo changes have hit the little guy very hard - was it last year that the new york times picked up the story about the small webmaster whose business was wiped out? we have all seen coverage like this.

there must be just as many sites with the inverse of tedster's traffic graph - what about them? maybe google is using throttling as a fuse to soften the effects of its algo, pr etc changes.

maybe next time i want to get some real good serps i'll set my alarm for 3am ;)

signor_john

3:12 pm on Aug 18, 2009 (gmt 0)



was it last year that the new york times picked up the story about the small webmaster who's business was wiped out?

That story was discussed here at length, and as I recall, the Web site in question didn't win a "Rose of the Month" award when subjected to a sniff test.

As for Google using throttling to "soften the effects of its algo, pr etc. changes," how would that improve Google's core product (Google Web Search) and maintain the loyalty of its end users?

Reno

3:32 pm on Aug 18, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



how would that improve Google's core product

This has to be the bottom-line criterion -- nothing will change from Google's point of view unless that change enhances or cements the usability & benefits of its service. That's why, earlier in this thread, I emphasized that if throttling is happening (on any level, at any time), it will only work for Google if they are moving up quality sites that give the searcher what that person is seeking. To move quality sites down without a peer replacement is self-defeating.

But also, I think we have to acknowledge that in 2009 there are, in all likelihood, more than 10 quality sites for most queries (admittedly not all). Therefore, moving things around can be seen as improving the core product -- like any other website or service, Google needs to be seen as dynamic, with fresh content. I would hazard a guess that a lot of people like the idea of seeing new SERPs from time to time, as it shows them options they may not otherwise have discovered.

So looked at from that point of view, fresh SERPs satisfy the searcher, spread the traffic around to more websites, and make Google seem more up to date. But again, that can be true only if all the top-listed sites provide quality content of approximately equal value.


tedster

7:23 pm on Aug 18, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



In many cases, when the throttled site disappears, its former ranking position is taken over by a Universal Search result.

internetheaven

8:44 pm on Aug 25, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



In many cases, when the throttled site disappears, its former ranking position is taken over by a Universal Search result.

... or 2 ... or 4 as I saw today! News and 3 youtube results ... for a commercial term!
