| 3:05 am on Oct 7, 2010 (gmt 0)|
I wonder if what we think is throttling is really Google sending searchers to different, unknown DCs that have different result sets?
I'm absolutely puzzled by the last two days. I've spent hours today looking through data and will post some interesting findings on the main SERP thread later. There was clearly some type of algo or filter tweak in the last week, and as usual it's normalizing again.
What Google does still amazes me; the amount of data it sorts is staggering. But I do feel they need to smooth out the changes a little better.
| 10:57 am on Oct 7, 2010 (gmt 0)|
One of my sites lost about 50% of its traffic on Tuesday. Most of its paid referrals weren't showing up in Analytics. It seems to have mostly recovered last night, though. Something is/was going on...
| 2:11 pm on Oct 7, 2010 (gmt 0)|
@drall, two quick questions.
Are you still seeing the same pattern in the site's traffic numbers?
|we monetize the site via a combination of 70% direct ad sales, 20% adsense ads and a handful of direct discrete affiliate agreements 10% |
Not implying anything here, but is there any possibility that, say, the 70% direct ad sales could be construed as, or mistaken for, paid links? I only ask because of a blast-from-the-past moment when reading a summary of an SMX session.
| 3:52 pm on Oct 7, 2010 (gmt 0)|
|but is there any possibility that, say, the 70% direct ad sales could be construed as, or mistaken for, paid links? |
Just following up to clarify this: do those links have a nofollow attribute on them?
If not, could Google confuse them with paid links?
| 8:39 pm on Oct 7, 2010 (gmt 0)|
This whole traffic-up-one-day, down-the-next pattern (when you have the same SERPs) is really getting annoying. But I do notice that it only happens to a certain set of sites, not all of them. Some sites are about the same every day, without spikes either up or down. I wonder if we can attribute this to G showing SERPs from a different DC to the majority of people, and on those DCs the sites aren't ranking as high.
| 9:01 pm on Oct 7, 2010 (gmt 0)|
Hmmmm... I wasn't intending to imply that the GA glitch was actually _related_ to the disturbances in the SERPS, but that is an interesting idea to be sure.
As has been pointed out before, GA data can't be a direct part of the algo because it could not be applied to sites without GA. However, suppose they use GA data as a quality check on the algo -- there are plenty of good "signals of quality" available from GA that would help to test whether algo tweaks are really returning higher quality results. If the quality check was automated, you might get some sort of cross-mojonation.
| 6:38 pm on Oct 8, 2010 (gmt 0)|
My brain hurts! I've been poring over my raw access logs for several sites, and it seems that Google.com is sending me no traffic whatsoever at the moment; some country-specific Googles are, but not much.
I've changed nothing on most of these sites recently, and the changes I did make were simply renaming some extensions and images to stop hotlinking.
Is Google pi$$ed with me for making a 302 on all my images?
A couple of these sites date from 1994 and 1995!
| 9:18 pm on Oct 19, 2010 (gmt 0)|
After a few observations in recent weeks on a number of sites that have been experiencing speed difficulties, my thought is that the "shaping/throttling" of traffic by Google could be considered a good thing.
Those sites appear to generate more business when not inundated with requests they can't handle. The ideal is that they perform no matter how large the traffic.
Google's priority is serving quality results, and if a site is not responding within reasonable times, then it makes no sense for Google to increase its traffic and create a bad user experience.
Google has a good idea of a site's crawl capabilities, so why would it treat display in the SERPs any differently?
Does anyone have similar observations of how sites that don't handle peak traffic well might have had their traffic "throttled" or "shaped"?
| 6:31 am on Nov 11, 2010 (gmt 0)|
whitey, I assure you that my site can handle 2x the visits without a millisecond of delay, so that is not it :)
| 9:30 am on Nov 11, 2010 (gmt 0)|
That's interesting feedback, but I'll leave the question out there for others, even though there's been no uptake.
Just to add some more perspective: since Place Search came in, I'm observing sites with major geo trophy terms lose 8-9 places or more, and these terms drove traffic.
But the overall traffic remains constant. So what's Google doing?
| 11:15 am on Nov 11, 2010 (gmt 0)|
Think about how many sites have the ability and the links to rank in the top spot for almost any given search; more and more every day, you would think. Google can share the love around as much as it likes and still show the "webmaster" the results he wants to see... Around here we call it the JO filter.
| 2:38 pm on Dec 30, 2010 (gmt 0)|
I wish I had "too much traffic". I run a dedicated server which this time last year was utilized at about a medium level. This year, thanks to all the changes, either economic, social network distraction or Google algo changes the server is now way under utilized. The traffic spigot turns on and off at apparently random times.
Yesterday it was dead from 7 am until 7 pm; then at 7 pm traffic opened up and sales flooded in. I even made 4 sales in the wee early-morning hours. Now it's 8 am and the server is quiet as a mouse. Last year at this time it would have been banging in the mornings. In fact, two days prior the pattern was reversed: 7 am to 7 pm was crazy, then poof, nothing. This pattern may change tomorrow, but one thing is certain: there is no real pattern that might help us determine the cause. This is not the natural sinusoidal traffic pattern we have tracked for the past decade.
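The on/off pattern described above is easy to see once requests are bucketed by hour. As a minimal sketch (the log lines and field layout below are invented samples in Apache/nginx combined-log style, not this poster's actual data), a day-by-hour profile makes "dead 7 am to 7 pm" visible at a glance:

```python
import re
from collections import Counter

# Extract day/month/year/hour from a combined-log timestamp like
# [29/Dec/2010:07:15:02 +0000]
LOG_TIME = re.compile(r'\[(\d{2})/(\w{3})/(\d{4}):(\d{2}):')

def hourly_profile(lines):
    """Return a Counter mapping (day, hour) -> request count."""
    profile = Counter()
    for line in lines:
        m = LOG_TIME.search(line)
        if m:
            day, mon, year, hour = m.groups()
            profile[(f"{year}-{mon}-{day}", int(hour))] += 1
    return profile

# Invented sample lines for illustration only.
sample = [
    '1.2.3.4 - - [29/Dec/2010:07:15:02 +0000] "GET / HTTP/1.1" 200 512 "http://www.google.com/search?q=widgets" "Mozilla"',
    '1.2.3.5 - - [29/Dec/2010:07:48:11 +0000] "GET /page HTTP/1.1" 200 1024 "-" "Mozilla"',
    '1.2.3.6 - - [29/Dec/2010:19:03:40 +0000] "GET / HTTP/1.1" 200 512 "http://www.bing.com/search?q=widgets" "Mozilla"',
]
print(hourly_profile(sample))
```

Plotting one such profile per day, side by side, would show whether the spigot really flips at the same hours or at random ones.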
| 7:51 pm on Dec 30, 2010 (gmt 0)|
You more than most seem to be in a position to verify the "throttle effect", with a dedicated server and a decade of data behind you. But here's my question: Is it only Google that is turning on & off? OR are you seeing it from multiple sources (G, Bing, Yahoo, etc)? If it's only G, then it appears from your post that your biz is highly dependent on Google traffic, to the point where they can shut you down if they shut you out. If, however, you see the decline across the board, then the water gets very muddy and I wonder: how can that be?
|The traffic spigot turns on and off at apparently random times. |
| 9:58 pm on Dec 30, 2010 (gmt 0)|
So very few people have been reporting this effect that it really makes me wonder if we got it right. As I mentioned in another thread, traffic turning on and off, taken on its own, sounds more like DNS Cache Poisoning [webmasterworld.com] than something Google is doing.
On the other hand, a sustained level of traffic with wide variations in country of origin, or even alternating between strong conversions and poor conversions - that might well be Google if it occurs over a large enough sample size.
There's a certain appeal to targeting Google in situations like this, but we really don't have a "smoking gun" so far - nothing that rules out non-Google explanations.
For some solid data, take another look at drall's account here: [webmasterworld.com...] for a site with a PR 8 home page. Yet even that could be accounted for by some black-hat DNS cache games.
As I see it, Google can't throttle traffic (clicks) unless they are also cycling a ranking position off and on - but not one of these reports has caught this happening. Does anyone have anything like that to report?
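To catch the rank-cycling the post above asks about, you would need daily rank observations logged next to daily click counts. A minimal sketch of that cross-check, using two hypothetical dicts keyed by date (all numbers invented for illustration), flags the suspicious case: clicks collapsing while the tracked position stays put.

```python
def unexplained_drops(rank_by_day, clicks_by_day, drop_ratio=0.5):
    """Return dates where clicks fell to drop_ratio or less of the prior
    day's count while the tracked ranking position did not change."""
    days = sorted(clicks_by_day)
    flagged = []
    for prev, cur in zip(days, days[1:]):
        fell = clicks_by_day[cur] <= clicks_by_day[prev] * drop_ratio
        same_rank = rank_by_day.get(cur) == rank_by_day.get(prev)
        if fell and same_rank:
            flagged.append(cur)
    return flagged

# Hypothetical tracking data: rank holds at #3 while clicks crater.
rank = {"2010-12-28": 3, "2010-12-29": 3, "2010-12-30": 3}
clicks = {"2010-12-28": 900, "2010-12-29": 880, "2010-12-30": 120}
print(unexplained_drops(rank, clicks))  # prints ['2010-12-30']
```

A drop that coincides with a rank change points at ordinary ranking movement; a rank-stable collapse like the flagged day is the pattern this thread has not yet documented.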
|azn romeo 4u|
| 6:12 pm on Dec 31, 2010 (gmt 0)|
Starting in October, something weird happened.
[i.imgur.com...] (last 9 months)
[i.imgur.com...] (last 120 days)
My traffic was steady for one site at about 2.1 million page views a month. Then it went up and up. December is a record for me: the first time ever that site has passed 3 million page views.
The last 3 months, I saw an increase like this.
10k more visitors in October than September
20k more visitors in November than October
40k more visitors in December than November
Basically, each increase has been double the previous one. Probably nothing, but interesting to see.
Also, I don't use Google Analytics since, IMO, that's also one way Google controls traffic to your site. I noticed this 3 years ago: every time I had Google Analytics on my site, my traffic would die off.
I was called a conspiracy theorist, but whatever, man. That's what I felt was happening.
| 7:35 pm on Dec 31, 2010 (gmt 0)|
I threw this up on the home page as a nominee for thread of the year.
Why? There seems to be something here. I don't know if we have the story right or not yet. Usually by this point, there has grown a consensus of what it means. That alone makes this worth looking at further.
| 7:47 pm on Dec 31, 2010 (gmt 0)|
|Also, I don't use Google Analytics since, IMO, that's also one way Google controls traffic to your site. I noticed this 3 years ago: every time I had Google Analytics on my site, my traffic would die off. |
I've been considering removing Analytics to see what effect, if any, doing that might have.
So the question is, how many of you are seeing the throttling without having Analytics set up?
And Happy New Year to all!
| 8:41 pm on Dec 31, 2010 (gmt 0)|
|As I mentioned in another thread, traffic turning on and off, taken on its own, sounds more like DNS Cache Poisoning [webmasterworld.com] than something Google is doing. |
Just went back and reread that thread (what a read) and, at first look, it does sound like it could be a cause. But then I have to go back to drall's numbers, which were what got me interested in the throttling topic in the first place. I just can't see how a hijacked cache would produce that almost exact 36K Google referrals per day. And if it were a hijack, wouldn't it also affect referrals from other SEs? I haven't seen any mentions of problems with Bing, though maybe people just haven't looked.
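The "almost exact 36K per day" point can be made quantitative: a capped or throttled referral count should show an unnaturally small day-to-day spread compared with organic traffic. A minimal sketch (the daily counts below are invented for illustration; only the 36K figure comes from the thread):

```python
from statistics import mean, pstdev

def coefficient_of_variation(daily_counts):
    """Relative day-to-day spread: population stdev divided by mean."""
    return pstdev(daily_counts) / mean(daily_counts)

capped = [36000, 36010, 35990, 36005, 35995]   # eerily flat, as reported
organic = [28000, 41000, 33000, 52000, 24000]  # normal-looking swings

print(coefficient_of_variation(capped))   # near zero
print(coefficient_of_variation(organic))  # an order of magnitude larger
```

A Google-referral series whose coefficient of variation sits near zero for weeks, while other referrers swing normally, would be far stronger evidence of a cap than eyeballing totals.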
| 8:45 pm on Dec 31, 2010 (gmt 0)|
|I threw this up on the home page as a nominee for thread of the year. |
Reading the home page info and thinking back a bit made me remember something about time of the day, day of the week, and specific result clicks having to do with rankings... I don't remember the entirety, but it was part of one of the patent applications from a few years ago.
It made me start wondering if the 'throttling effect' is so hard to pin down and understand because it's not 'throttling' directly, but rather time-dependent result sets of some type...
It doesn't explain the 'totally level' traffic described in one of the situations, but I'm wondering if what appears to be a 'throttling effect' maybe has a different cause?
| 9:09 pm on Dec 31, 2010 (gmt 0)|
realtime keyword position tracking :)
| 9:19 pm on Dec 31, 2010 (gmt 0)|
|So the question is, how many of you are seeing the throttling without having Analytics set up? |
I have it up, but if they're using Analytics to throttle/monitor traffic levels... couldn't they use AdSense and things like the DoubleClick ad-management software as well? I mean, you'd have to strip Google completely from your site in order to know whether or not they're using the data their services collect against you.
| 9:38 pm on Dec 31, 2010 (gmt 0)|
There's another totally excellent thread that touches very potently on this topic - Google & Traffic Shaping [webmasterworld.com]. In that thread, Shaddows shares some very evidential analysis, not just about the total amount of traffic but about its quality.
|Shaddows: Using a multivariate dataset, across a range of different keyphrases, user intents and user types, Google exposed our site in marginal but significant ways (putting us up one place, dropping Universal search, above or below shopping results, etc). They did this with (at least) four separate sets. |
So far, this is the only analysis of traffic shaping that includes a look at ranking changes. If Google is involved with this type of thing, then in my opinion, ranking changes MUST be part of the picture. If rankings stay the same but traffic shows major disturbances, I just can't see laying the cause at Google's feet.
If traffic shaping is real, Google's goal would most likely be to better serve their users, not a program that targets webmasters. Of course, webmasters would still feel the effect, whether positive or negative.
This was my initial reply to Shaddows in that thread:
|We talked about 3 very big buckets of intention "informational, navigational and transactional" - although I'm sure Google has a much more refined set of user intention buckets than this. Another user intention could be "locational". There's little doubt that some queries have an implied geographic component. |
Here's the missing piece in that analysis. In order to tailor specific SERPs to specific user intentions, Google must also assign each website, and possibly each URL, to a specific taxonomy. Only then would they understand which type of page should be returned to which type of user intention.
It seems to me that Google has cranked up some kind of statistical testing - one that tries out a given page against different types of query intentions, and then takes note of the results. After a while, they could discover which intention taxonomy works best and then make a more stable assignment of website type - and some pages might have more than one type.
Yes, this thread is not the place for complaining.
Serious analysis only, please!
If this is the case (and yes, it is definitely in the area of a conjecture, not something we've proven) then I would expect most sites who get this treatment to stabilize relatively quickly as Shaddows reported. If a site stays stuck in such a pattern, then my guess would be that either Google can't get statistically relevant information for some reason (weak signals of some kind) - or my analysis is just plain wrong - that's always a possibility ;)
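The conjecture above can be sketched as a bandit-style test: treat each intent bucket as an arm, expose the page against each, and let observed click-through settle which bucket fits best. Everything here is invented to illustrate the conjecture and is in no way a description of Google's actual systems; the bucket names come from the post, the epsilon-greedy mechanism is my assumption.

```python
import random

def pick_bucket(stats, epsilon=0.1, rng=random):
    """Epsilon-greedy choice: usually the best-performing bucket so far,
    occasionally a random one to keep testing the alternatives."""
    if rng.random() < epsilon:
        return rng.choice(list(stats))
    # Otherwise pick the bucket with the best observed click-through rate.
    return max(stats, key=lambda b: stats[b]["clicks"] / max(stats[b]["impressions"], 1))

# Hypothetical exposure results for one page across four intent buckets.
stats = {
    "informational": {"impressions": 1000, "clicks": 50},
    "navigational":  {"impressions": 1000, "clicks": 10},
    "transactional": {"impressions": 1000, "clicks": 90},
    "locational":    {"impressions": 1000, "clicks": 30},
}
print(pick_bucket(stats, epsilon=0))  # prints "transactional"
```

Under this kind of scheme, the exploration phase would look like exactly the marginal, temporary SERP wobbles Shaddows described, and the quick stabilization would correspond to the assignment settling.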
| 7:05 pm on Jan 8, 2011 (gmt 0)|
I basically just want to plop this here as a reference, because it's something that at first look can appear to be traffic throttling. From a Webmaster Tools discussion [google.com] question answered by Google employee AsaphZemach:
|Why are the number of impressions (24,900,000) for our web site stays exactly the same for the last three months? |
When you see 24,900,000 you should really think of it as 25M +/- 2.5M. Which tells you that the number of impressions has remained within plus or minus 10% for the duration you are referring to.
Somewhat tantalizing, but we would really need the duration for the above, and then daily/weekly breakdowns, to draw any sort of conclusion as to the possibility of "throttling."
via RustyBrick at SER
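The rounding point in that answer is worth making concrete: if the displayed impression count is rounded to a coarse bucket, many different true values all show the same figure, so a "stuck" number may just mean the true count stayed inside one bucket. A minimal sketch, assuming 3-significant-figure rounding for illustration (the actual granularity Webmaster Tools used is not documented in the thread):

```python
def round_sig(n, sig=3):
    """Round a positive integer n to `sig` significant figures."""
    if n == 0:
        return 0
    digits = len(str(abs(int(n))))
    factor = 10 ** (digits - sig)
    return int(round(n / factor) * factor)

# Very different monthly totals can display identically:
for true_value in (24_851_000, 24_900_000, 24_949_000):
    print(true_value, "->", round_sig(true_value))
```

Every true value in roughly the 24.85M to 24.95M band collapses to the same displayed 24,900,000, which is exactly why three months of an identical figure tells you only that impressions stayed within a band, not that they were pinned to one number.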
| This 233 message thread spans 8 pages |