Are search results being throttled?

     
9:44 pm on Nov 29, 2016 (gmt 0)

New User from US 

joined:June 28, 2016
posts:11
votes: 1


A weird trend I've noticed recently... (from someone who analyzes his stats several times a day)

It almost seems like our positions lately are being throttled. ("our" as in everybody's)

Gone are the days when you were #1 or #2 on Google forever... it's like once Google realizes you've had a certain number of hits per day, your results get throttled back some... and others move forward.

Thoughts?
1:23 am on Nov 30, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3083
votes: 208


Well, people here have been talking about throttling for years. They've even shown charts that appear to show their traffic bumping up against a ceiling. I've never seen it myself, and have never really understood the mechanism or the purpose.

So there are two questions:

1. Why would google do it?

2. How does google do it, if in fact they do?
2:03 am on Nov 30, 2016 (gmt 0)

New User from US 

joined:June 28, 2016
posts:11
votes: 1


1.) Notice how every Google search result has Google's own link embedded. So they're instantly, automatically, and always aware of how many hits/visitors every site is getting at any time. Even though a search result looks like it links straight to the site's domain, it never does. Every link clicked on from Google search starts with something similar to:
[google.com...]
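A minimal sketch of what such a redirect carries, assuming parameter names like "url" and "q" that Google's click-tracking links have used at various times; this is illustrative, not a claim about the current format. The visible result points at google.com/url, and the real destination rides in a query parameter, so the click can be logged before the browser is forwarded:

    # Hypothetical click-tracking link of the shape described above.
    # Parameter names are assumptions; "url" and "q" have both appeared
    # in Google's redirect links over the years.
    from urllib.parse import urlparse, parse_qs

    tracked = ("https://www.google.com/url?sa=t"
               "&url=https%3A%2F%2Fexample.com%2Fwidgets&usg=XYZ")

    params = parse_qs(urlparse(tracked).query)
    destination = (params.get("url") or params.get("q") or ["(unknown)"])[0]
    print(destination)  # -> https://example.com/widgets

Either way, Google sees the click before the destination site does.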

2.) An automatic algorithm
4:03 am on Nov 30, 2016 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Oct 29, 2012
posts:359
votes: 41


It has been like this for as long as I can remember, 3~4 years now. Maybe the trend is that it's pushing out to more sites and more niches; my niches are not all that important and were probably among the first to be experimented with. Throttling works as a plateau (both as resistance and as a base): you either break through it to reach a new traffic level, or you dwindle down from your ceiling. A dampening factor comes to mind. Be prepared to stay at this base for a while...

I would believe Google does it to prevent any one site from getting 100% of the pie. A/B testing, diversity, data points, etc. Why would Google "not" throttle it to their advantage, when they could? They do have complete control over the gate. It's a double-edged sword and probably a byproduct of something else algorithmic. It protects dominant sites from instantly being taken over, and prevents #*$!ty sites from taking over positions that they should not. It works as intended most of the time.

The most obvious tell I have seen personally is that the SERP listings for certain sites can disappear completely after a while, once a certain number of visits has been reached, especially for long-tail searches. This way Google can preserve the webmaster's impression that their SERP position is #xxx, while the traffic volume isn't really at that level. So where is the traffic? The searchers have to land "somewhere".

I stopped worrying about this a while ago. It's not within our control. Better UI/content and link exposure are probably the best ways to spend energy and time.
4:43 am on Nov 30, 2016 (gmt 0)

Full Member

10+ Year Member

joined:May 25, 2006
posts:261
votes: 23


I have never seen this, and to be honest I can't see why Google would have 'number of visitors already sent to site' as some kind of ranking factor, although automated processes that continually compare different sets of results could perhaps have this effect?

We usually have extremely constant and predictable traffic volumes (information sites), but if, for example, a news story provides a 'burst' of a few thousand visitors in an hour, I have never seen a sign of traffic later in the day being throttled as a result.

"This way Google can preserve webmaster's impression that SERP position is #xxx, while the traffic volume isn't really at that"

Also can't see why they would bother doing this - why do they care what a webmaster thinks their SERP position is?

For those who are seeing it, it should be easy to provide examples of analytics graphs or figures that prove (statistically demonstrate) the effect, which might make it easier for others to see how it operates (e.g. does traffic stop completely after 10,000 visits in a day or slowly taper off, does it operate country by country, etc.).

Once the throttling can be tested and consistently demonstrated by a variety of sites it might be possible to identify the exact circumstances when it takes place.
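A minimal sketch of one such test, assuming you can export daily Google-organic visit counts from your analytics (all numbers below are hypothetical): a hard cap should show up as days piling up just under the observed maximum, which natural day-to-day variation rarely produces.

    # Rough ceiling check: what fraction of days sit within a few percent
    # of the best day? A high ratio is consistent with a cap; a low one
    # with ordinary variation. Data and threshold are hypothetical.
    from statistics import mean, stdev

    def near_peak_ratio(daily_visits, band=0.03):
        peak = max(daily_visits)
        near = [v for v in daily_visits if v >= peak * (1 - band)]
        return len(near) / len(daily_visits)

    visits = [9800, 9950, 9990, 9985, 9970, 6200, 5900,
              9940, 9995, 9960, 9975, 9990, 6100, 6050]
    print(f"days within 3% of peak: {near_peak_ratio(visits):.0%}")
    print(f"mean {mean(visits):.0f}, stdev {stdev(visits):.0f}")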

It would be interesting to understand why many webmasters are convinced they see traffic throttling (or periodic restrictions on converting traffic, as often mentioned in other threads that could be related), since both these effects clearly seem implausible to those who don't see them, but very real to others.

Although as frankleeceo says, focussing on content etc might be a more productive use of time!
11:29 am on Nov 30, 2016 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Sept 12, 2014
posts:377
votes: 64


Why MIGHT this be happening? If the number of searches is dropping, it would have a devastating effect on their stock value. Spreading or rationing the searches across a bunch of different websites might help hide the loss of searches. Remember, there is a business entity behind the algo.
12:41 pm on Nov 30, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 12, 2006
posts:2587
votes: 65


I can see why they *might* want to change the rankings throughout the day.
To pick an obvious example, if you search for "children's games" in the early afternoon, then Google might assume you're a kid wanting to play them, whereas if you do it at ten o'clock at night, you're probably an adult wanting to buy them. So your rankings and traffic might plummet periodically each day.
12:58 pm on Nov 30, 2016 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 25, 2004
posts:954
votes: 38


I absolutely believe throttling exists. The evidence, to me, is that in my daily-timeframe Google Analytics charts, each day has its own (relative) characteristic level of traffic.

On some weeks the normal rounded plot of the days looks as if it's hit a ceiling and is literally flat across the top for the week.

I serve a high enough volume of pages that I can't imagine that plot occurring naturally.
1:52 pm on Nov 30, 2016 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Nov 2, 2014
posts:456
votes: 191


I saw "throttling" start in my industry the fall of 2015. Both the QUANTITY and QUALITY of organic traffic have been manipulated by Google since then. The problem I think Google has is the throttling appears to impact Adwords as well, which caused many to leave Adwords entirely because of its inability to produce a reasonable and consistent ROI. Active campaigns I had running for years in Adwords suddenly were producing a negative ROI, yet the same campaigns in Bing/Yahoo remained stable with a profitable ROI.

Since I sell products, the solution to this problem was simple. I joined Amazon and used a portion of the money I once spent on Adwords to advertise in Amazon's sponsored products. This has worked out well for me, even after Amazon's 15% cut of every sale, and I don't have the headaches created by beating my head against the wall with Adwords anymore.

I don't necessarily look at throttling as being that bad for my business in the long-term. Since Google feels compelled to manipulate traffic to the degree they are, it is an indication that Google may finally be reaching their plateau not only in profits but relevance. In the next year or two I would expect to see Google's market share and profits from search slip, as other business owners like myself, get fed up with Google's games and leave. When consumers find it difficult to find products in Google, but can easily find them in Bing, Yahoo, Amazon, etc. that will help initiate a change in consumer behavior that moves them away from Google as well.
1:59 pm on Nov 30, 2016 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Apr 1, 2016
posts: 638
votes: 183


(from someone who analyzes his stats several times a day)

That is exactly the problem: you are spending too much time looking at a chart. You are seeing patterns where there are none. What do you expect to see, a chart with big peaks? With big peaks come deep valleys.

Your site has a relative ranking (not one that you can check with some keyword tool): a ranking based on all the potential keywords that lead to your site, relative to all the other sites that exist for those keywords. That is all keywords, from high-volume "keywords" to long-tail (i.e. low-volume) search phrases. Your niche has an average volume of searches, as well as variance in that volume (i.e. normal daily fluctuations). What you see is a similar pattern that repeats every period (day, week, month) based on these three factors. This may look like a ceiling, but it is not.

If you want to change your relative ranking, you need to build more content, get new links, and make your site better. Then, as these things are discovered through natural crawling, you will take on a new position. Your rankings will improve for a while as the content/links are discovered; then you will reach a new mean and fluctuate around it. And it is possible that you do these things and it has no effect, as it may not be sufficient to move you up.

It seems that you are expecting only positive peaks; the only way that is possible is if your traffic keeps increasing ad infinitum. That obviously won't happen, so what do you get? You get what you see.
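The plateau NickMNS describes is easy to simulate. A minimal sketch, all numbers hypothetical: draw daily traffic from a fixed mean with normal noise, and almost every day lands below a line roughly two standard deviations above the mean. On a chart, that line looks like a ceiling, even though nothing is capped.

    # Uncapped traffic that still "hits a ceiling" on a chart: daily
    # visits drawn from a fixed mean plus normal noise. All numbers
    # are invented for illustration.
    import random

    random.seed(42)
    mean_visits, daily_sd = 1000, 60

    days = [random.gauss(mean_visits, daily_sd) for _ in range(90)]
    apparent_ceiling = mean_visits + 2 * daily_sd  # ~97.7% of days below

    above = sum(1 for d in days if d > apparent_ceiling)
    print(f"apparent ceiling: ~{apparent_ceiling:.0f} visits/day")
    print(f"days above it out of 90: {above}")  # typically just a few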
3:21 pm on Nov 30, 2016 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member

joined:Aug 11, 2008
posts:1341
votes: 120


We saw throttling YEARS ago, and believe the modern version to be strongly related to [mis-matched / buy-averse / non-engaged]* traffic. This has come in two very different flavours over the years.

A warning before I continue. As has always been the problem here, a lack of syntactic/analytic discipline usually derails this type of discussion. For example, people do not limit the discussion to only Google organic traffic. This means things become like the Trump/not-Trump discussion (or leave/remain, if you prefer), in that people feel perfectly happy knowing that the other side are simple cretins who just don't get it. Therefore they can be ignored and/or insulted with impunity.

As a worked example:
WidgetMan sees throttling, on undifferentiated traffic.
ThrottleSceptic says "That's not just Google organic, so your views are invalid".
WidgetMan may not know, and certainly does not explain, that 95% of his traffic is from Google and the remaining 5% always stays stable.
Even if he did explain, ThrottleSceptic takes the stable 5% as evidence that ALL traffic is throttled, so it is probably a server/hosting problem.
WidgetMan and ThrottleSceptic then ignore each other, except when baiting or insulting each other.


1) The buzz-cut
I have not really seen this since the Caffeine infrastructure was launched. It's the originally observed throttling, whereby you cannot get more than X visitors in a time period.

The mechanism could be simple or complex, but per Occam, let's go with simple: a referrer budget set by some over-arching non-keyword score. The site reaches its limit, gets dropped a few places, and loses most click-throughs.

2) The fuzzy line
This is really, really hard to convince other people of. Basically, your referrals are always in a tight band. An early spike means a late dip. I originally saw this in weekly traffic numbers. A really good week would be killed by a statistically improbable Friday, leaving a +/- 3% number. The fuzzy line has another significant feature worth noting: if your traffic is down, you get a boost. For example, I would get a "normalised" bank-holiday Monday, where traffic was close to a regular Monday, rather than the weekend norm as with our non-G traffic. Also, we would have a strong Tuesday-Friday, meaning the week view would be within normal variance. **[But see edit below]

The mechanism seems to be more complex. One method would be that a site or page approaching its limit gets dropped a rank or two. CTR drops, but is not eliminated, so there is little in the way of a smoking gun. Getting a bit tin-foil due to the processing required: theoretically, Google could calculate a "glide-path" and tweak your ranking a position up or down to keep you on it. That way, there would be very little way of detecting throttling, except by outcome.
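To make the glide-path speculation concrete, here is a toy proportional controller, purely hypothetical (no claim that Google works anything like this), that nudges a page's rank so cumulative clicks track a straight-line daily budget:

    # Speculative "glide-path" sketch: demote when clicks run ahead of a
    # daily budget, promote when they fall behind. Every number invented.
    def adjust_rank(rank, clicks_so_far, budget, hour):
        target_by_now = budget * hour / 24        # straight-line glide path
        error = clicks_so_far - target_by_now     # + means ahead of schedule
        if error > 0.1 * budget:                  # well ahead: drop a spot
            return rank + 1
        if error < -0.1 * budget:                 # well behind: gain a spot
            return max(1, rank - 1)
        return rank                               # on the path: hold

    rank = 3
    for hour, clicks in [(6, 400), (12, 900), (18, 1000), (24, 1150)]:
        rank = adjust_rank(rank, clicks, budget=1200, hour=hour)
        print(f"hour {hour:2}: {clicks:4} clicks so far -> rank {rank}")

The outcome would match what Shaddows describes: CTR dips but is never eliminated, and the only visible symptom is a suspiciously tight band of referrals.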

Also, the fuzzy line, in sharp contrast to the buzz-cut, seems to not apply to the whole site. I don't know if it is throttled by page or keyword or something else, but we found we could get peaks in different areas at different times.

Ok, but why...
It's been mentioned by someone else above, but might this be Google trying to be fair?

With 1,000,000+ results for any given search, a winner-takes-all approach where the sole #1 occupant takes all the traffic seems, well, stupid. Why not rotate the top spots to share the love? For a start, this makes it more likely the putative "winner" can cope with the traffic - far from a given if an update pushes a smallish site unexpectedly to #1.

If you are dispositionally opposed to the idea of Google being "fair", there is a data-collection angle. Rotating results allows Google to collect relative performance data: for example, CTR data and bounce-derivative behaviour (ClickBack-ReClick, ClickBack-Refine, ClickBack-NewSearch; not raw bounce) for multiple sites in a controlled environment. Presumably, the relative aggregate behaviour can be used to refine the SERP, or refine user-intent, or even as some sort of ranking factor for the rotated pages. I do A/B testing, and I don't have the Big Data analytical power nor legions of doctorate-level boffins to interpret that for me. Google does.

*Avoiding the Z-word so as not to derail this thread

**ETA - I believe my data, but see for example "reversion to the mean" as to why spikes/troughs often disappear when "zoomed out"

[edited by: Shaddows at 3:42 pm (utc) on Nov 30, 2016]

3:35 pm on Nov 30, 2016 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Apr 1, 2016
posts: 638
votes: 183


How do you know what your ceiling is?
3:44 pm on Nov 30, 2016 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Nov 2, 2014
posts:456
votes: 191


With 1,000,000+ results for any given search, a winner-takes-all approach where the sole #1 occupant takes all the traffic seems, well, stupid. Why not rotate the top spots to share the love?

I had thought of this possibility some time ago, though it is a double-edged sword. In my industry we have very few competitors. Our competition produces cheaper, albeit inferior, products. By inferior I mean that when they fail, damage may occur that puts lives at risk (think of a cheap wheel on a car failing on the highway at 70 MPH). Though our prices are higher, natural selection should boost us and prevent inferior products from ever reaching the top of the search results. I would argue we should be on top because product quality, and reviews to that effect, are an indicator of our customer-first approach. It has worked well for us throughout the years, leading to many orders that bypass the internet entirely as we became the go-to provider for the types of products we produce and sell. Among our customers are government agencies, Fortune 500 companies and, yes, even the major search engines.

Natural selection weeds out the garbage. And if Google is trying to be fair in my industry by rotating sellers of similar products, they are not doing the consumer any favors by putting their safety and financial investments at risk.
3:48 pm on Nov 30, 2016 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member

joined:Aug 11, 2008
posts:1341
votes: 120


How do you know what your ceiling is?
Empirically!

Seriously though, I found the buzz-cut to be compelling, as evidence goes. But the fuzzy line is, well, pretty darn weak.

What's the difference between G-throttling and market saturation? How do you know you have addressed your total market, versus hit your visitor limit?
4:00 pm on Nov 30, 2016 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Apr 1, 2016
posts: 638
votes: 183


@glakes
Natural selection weeds out the garbage.

Not exactly. Natural selection selects the entity that can best sustain itself in the current ecosystem, in this case the marketplace. If customers are demanding lower prices and not placing much importance on safety and longevity, then unfortunately the "high cost, high safety, high quality" suppliers will be weeded out, and the "garbage" will rise to the top.
4:20 pm on Nov 30, 2016 (gmt 0)

New User from US 

joined:June 28, 2016
posts:11
votes: 1


Nice responses/input.

I also forgot to mention my main reasoning for this. Every day I'd have the same number of visitors, but with different/varying keyword phrases (although still relevant to the main keywords). Still different keywords altogether. That alone is a smoking gun.

But I run several sites in completely different categories, so I can also see why others here have said they've never seen this type of behavior. On many of my non-adult sites I also do not see this behavior as frequently. So it could be that certain types of sites (or entire categories) are thrown into "throttle" mode while others are not. That would also be a smart technique for Google to keep everybody guessing/arguing with each other.
5:18 pm on Nov 30, 2016 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Oct 29, 2012
posts:359
votes: 41


@NickMNS

In terms of informational sites, this effect might be how some of the more popular garbage weeds out the real news. Customers demand sensational untruths more often than not. In terms of real goods, though, "repeat" customers who trust certain brands counter this effect.

@Mikeinjersey

This kind of explains my view about how a site gets throttled by not being shown at all in the SERPs for certain searches.

Throttling looks like this: more content, more links, more everything gets added, but the same amount of traffic gets rotated to different parts of the site. Pages that suddenly become popular take away from other popular pages, resulting in a damped top, even when a popular new traffic peak should have lifted the total by 20%. It doesn't; it becomes something like 2~3%. The new pages target totally different users, so the other pages' traffic should remain; it does not, and gets shifted.

Market saturation occurs when you have so completely dominated a niche that there is no way for you to get more people onto your site. It happens when you hold the #1 SERP position for all keywords.

The difference is pretty easy to tell.

This behavior is easiest to see on sites that try to cover broader and expanding topics. For example: say a tech site that covers Apple products expands into Samsung. The ceiling would have existed while covering Apple products, even though adding the Samsung content should theoretically have increased the traffic two-fold or more. Most likely it will settle around a 20% increase.
5:32 pm on Nov 30, 2016 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Apr 1, 2016
posts: 638
votes: 183


adding in content... ...should theoretically have increased the traffic two-fold or more.

Can you elaborate on that theory?
5:36 pm on Nov 30, 2016 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Oct 29, 2012
posts:359
votes: 41


Just numbers as an example: an existing site gets 500 visitors. Add completely new content that hits a completely different user base and adds another 500 visitors. But the total is not 1000. The existing site will get some of those pre-new-content visitors shaved off and shifted to the new pages.
5:51 pm on Nov 30, 2016 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Apr 1, 2016
posts: 638
votes: 183


Let me see if I understand this correctly.

I create a site dedicated to the sale of purple Hello Kitty iPhone 7 cases. I manage to rank the site number 1 for many terms related to this niche. All is good in my world: say 1000 visits a day with a great conversion rate of, say, 5%. I now decide to expand my site to sell the iPhone 7 itself, so I create content targeting it, using the same level of quality as I did in the past. So now I am getting 2000 visits a day and am suddenly selling 50 purple cases and 50 iPhone 7s a day.

Did I understand your theory correctly?
5:57 pm on Nov 30, 2016 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Oct 29, 2012
posts:359
votes: 41


That's partially my theory, and I'll plug in your example. Let's assume the site does rank for iPhone 7 and gets 1000 "new" visits on top of the Hello Kitty traffic. Now the Hello Kitty pages will get hit, drop to 300, and lose that traffic.

The immediate overall split will be 200~300 Hello Kitty and 1000 iPhone 7. As time goes on and the traffic level evens out, you will most likely see 500 Hello Kitty and 500 iPhone 7 (or some other split), resulting in the same 1000 visitors as before the introduction.

That's the appearance of throttling, and the ceiling, that I have experienced.
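A toy model of that convergence, assuming (as the numbers above imply) a fixed site-level total of 1000 visits that the two sections merely redistribute between themselves:

    # frankleeceo's split, as a loop: the site total is pinned at 1000
    # while the Hello Kitty / iPhone shares drift toward an even split.
    # The cap and the drift rate are invented for illustration.
    site_cap = 1000
    kitty, iphone = 300, 1000  # immediate post-launch split

    for week in range(1, 9):
        total = kitty + iphone
        kitty = kitty * site_cap / total      # squeeze back under the cap
        iphone = iphone * site_cap / total
        drift = 0.25 * (iphone - kitty)       # shares slowly even out
        kitty, iphone = kitty + drift, iphone - drift
        print(f"week {week}: kitty {kitty:4.0f}, iphone {iphone:4.0f}")

After a few iterations the split settles near 500/500 with the total never moving, which is exactly the ceiling being described.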
6:16 pm on Nov 30, 2016 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Apr 1, 2016
posts: 638
votes: 183


Ranking for a specific term such as "purple Hello Kitty iPhone 7 case" should be relatively easy and straightforward, whereas ranking for a highly competitive term such as "buy iPhone 7" seems nearly impossible. What makes you believe that just because you rank for some random search term, you would automatically rank for any other term, including highly competitive ones? Am I missing something?

Also, if you have a site specific to some topic, then start adding pages about other random topics, Google's interpretation of the site's content will likely change; thus you could negatively impact your ranking for the specific topic.

As time goes on and the traffic level evens out.

Yes, of course: as Google's interpretation of your site adapts as a result of the new content, your positions for the various keywords, new and old, will settle in, and you converge to a new mean with a new level of variation. If this is what is referred to as throttling, then yes, there is "throttling". To me this is simply ranking in a complex dynamic system.
6:52 pm on Nov 30, 2016 (gmt 0)

Junior Member

10+ Year Member Top Contributors Of The Month

joined:Feb 4, 2004
posts: 148
votes: 12


I believe, without a doubt, that a human editor has us filtered to never go above #11 for any term relating to our niche. We've been in business 12 years. We've been through it all: quality posts, SEO, etc.

This is cronyism at its best.

In our niche, the big players have taken over. Money begets money. When you spend a few thousand dollars a month on ads, you also get the organic positions, sometimes two of them. It's a nice circle of wealth, created by human editors who are on the take for that placement. Without a doubt! I also believe they are doing something dishonest with the traffic that you pay for. Strange how one day everything converts, then the next, no sales. Yet rankings stay the same (nothing over #11).

We are being had.

Now to convince the world to use duckduckgo.com
6:57 pm on Nov 30, 2016 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Oct 29, 2012
posts:359
votes: 41


I do not think the ability to rank should be discussed, but rather the aftermath of achieving that rank and traffic, and what that newfound traffic does to existing traffic. By default, throttling assumes that you would be able to achieve a much higher rank/traffic if throttling did not exist.

So yes, now we agree that there is some form of throttling that is a byproduct of a complex dynamic system. We used an extreme example of Hello Kitty and iPhone. Another question: what happens to the 500 Hello Kitty visitors after our assumed site loses them? They still search, and will go somewhere else.

The effects can become more granular after establishing that extreme cases exist and occur. Minor steps of "throttling" or "complex dynamic system" (really just a name, in my opinion) occur between items of a similar nature as well, say Hello Kitty cases and Snoopy cases, and other kinds of cases. Getting more traffic for a specific new product has the potential to take away traffic from existing products. The traffic level of the overall site is the ceiling that is hard to break through once a level has been established.
7:02 pm on Nov 30, 2016 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Nov 2, 2014
posts:456
votes: 191


Not exactly. Natural selection selects the entity that can best sustain itself in the current ecosystem, in this case the marketplace. If customers are demanding lower prices and not placing much importance on safety and longevity, then unfortunately the "high cost, high safety, high quality" suppliers will be weeded out, and the "garbage" will rise to the top.

The point of my previous post was that natural selection does work, including at the price points you mentioned, and there are reasons why some businesses should be firmly planted in the #1 position. But what happens when the consumer is not given all of the information needed to make an educated purchasing decision? That is what Google may be doing, if it is rotating traffic, and in an economy where consumers rely heavily on reviews, I see this as troubling. Though dispersing buyer traffic to all businesses in an industry may be fair in shallow terms, it circumvents the natural selection process by overlooking the many bad reviews, complaints and lawsuits that should otherwise put such businesses at the bottom of anyone's list. And let's face it, most businesses don't brag about their products being safety hazards. Competitors often point out the flaws of other products on the market. When these competitors are dropped from a search index, valuable information that impacts the safety of consumers is suppressed.
7:48 pm on Nov 30, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3083
votes: 208


frankleeceo wrote:
Just numbers as an example: an existing site gets 500 visitors. Add completely new content that hits a completely different user base and adds another 500 visitors. But the total is not 1000. The existing site will get some of those pre-new-content visitors shaved off and shifted to the new pages.


When you add new content, your site may become less focused.

Google might give you credit for being something of an expert in one area (your old content). But if, on the same site, you try to look like an expert in two areas (old content and new content), then Google might decide that you're not an expert in either area. Thus you don't get the big overall traffic boost that you expected.

If you want google to think that you're an expert in two different areas, then you need two different sites.
12:48 am on Dec 1, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Sept 14, 2011
posts:758
votes: 24


When you add new content, your site may become less focused.

Google might give you credit for being something of an expert in one area (your old content). But if, on the same site, you try to look like an expert in two areas (old content and new content), then Google might decide that you're not an expert in either area. Thus you don't get the big overall traffic boost that you expected.

If you want google to think that you're an expert in two different areas, then you need two different sites.


Strange that, for instance, Amazon doesn't have this problem, nor do many other brands. Or do you think different rules apply per website?
12:55 am on Dec 1, 2016 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Apr 1, 2016
posts: 638
votes: 183


It is a matter of relative size. Amazon is so huge that no single page addition will have a significant impact, whereas if your site has ten pages of content and you add one, that is a 10% increase.
12:59 am on Dec 1, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:May 22, 2005
posts:657
votes: 20


Are search results being throttled?

Yes - next question?
6:37 pm on Dec 15, 2016 (gmt 0)

New User from US 

joined:June 28, 2016
posts:11
votes: 1


Everything is sooo throttled, it's ridiculous and not even worth optimizing your sites anymore.

They set a standard value for your site and leave it at that, no matter how well optimized it is. Once you reach your set number of visitors per day, they throw your results back for the day... and pull others forward.

Backlinks still matter a lot, but the type of backlinks is crucial. They have to be unique, and not follow the same backlink 'pattern' as other webmasters' sites.

Also, if you have connections to people at Google, or are generally 'nice' to them... they can give your site's value a minor boost as well.