
Are search results being throttled?

     
9:44 pm on Nov 29, 2016 (gmt 0)

New User from US 

joined:June 28, 2016
posts:14
votes: 1


A weird trend I've noticed recently (from someone who analyzes his stats several times a day):

It almost seems like our positions lately are being throttled ("our" as in everybody's).

No more are the days where you were #1 or #2 on Google forever. It's as if, once Google realizes you've had a certain number of hits per day, your results get throttled back some and others gain ground.

Thoughts?
8:51 am on Dec 16, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 12, 2006
posts:2618
votes: 76


When you add new content, your site may become less focused.

If you want google to think that you're an expert in two different areas, then you need two different sites.

This is what I've experienced as well.

I was already ranking well with a smaller site, so I added a sizeable new section to it. Instead of going up, my traffic ended up going down (quite dramatically). There was nothing wrong with the actual content as far as I could see.

So I figured the same as you: Google had considered me an 'expert' in one narrow area, but when I added this new section I must have 'diluted' it, so I ended up being an expert in nothing.

In the end I decided to remove all the new content to get my traffic back.
9:18 am on Dec 16, 2016 (gmt 0)

Administrator from GB 

WebmasterWorld Administrator engine is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month Best Post Of The Month

joined:May 9, 2000
posts:24335
votes: 554


In the end I decided to remove all the new content to get my traffic back.


londrum, can you update us on the result of that change?
9:31 am on Dec 16, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 12, 2006
posts:2618
votes: 76


I got all my traffic back, but it took quite a while -- months, from what I remember.

To the poster who was talking about Hello Kitty phone cases: I think you would see better results if you concentrated on adding content about exactly that -- Hello Kitty iPhone cases.
Don't add stuff to do with iPhones in general,
because you might end up with less traffic, not more.
11:03 am on Dec 16, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member bwnbwn is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Oct 25, 2005
posts:3542
votes: 19


To add to this discussion: this was brought up at least 8 years ago and has continued to pop up in threads.
Here is what WebmasterWorld members figured out about why this was happening back in 2008.

At that time, around 2008 (give the year a little leeway), a high number of search terms were dominated by savvy SEOs. About the only way to get into the SERPs was AdWords, and we all know click fraud was really bad, especially during the early years of AdWords.

Long-time WebmasterWorld mod Tedster (RIP) coined the term "throttling" for this.

The search below is a good overview of the years this has been discussed:
https://www.google.com/#q=tedster+throttling+site:webmasterworld.com

Throttling was brought into the search results to allow many more websites into the organic SERPs. There are still a few dominating websites, but nothing like it was back in the heyday.

Throttling is Google's way of directing the available search traffic per niche to a much larger number of websites.

Organic traffic then makes a website owner more likely to purchase AdWords traffic.

Think about that

---

Mod's note: the hashtag in the above URL breaks in the WebmasterWorld redirect script. Either copy and paste the above URL into your address bar, or search for the following:
tedster throttling site:webmasterworld.com


[edited by: Robert_Charlton at 2:43 am (utc) on Dec 19, 2016]
[edit reason] Delinked link which wasn't working [/edit]

1:21 pm on Dec 16, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts: 922
votes: 230


Think about that

Let's...
What is throttling? Google limits the traffic sent to your website to within some range of daily, hourly or monthly visits.
This means that when one looks at one's analytics, one sees a distinctive pattern: for any given period, say a day, the amount of traffic sent to your site always falls within that particular range.
How is that range set? It is different for each site, based on undisclosed and undefined factors, i.e., there is no way to predict what the range will be for your site; it simply is what it is.
How long does this last? It goes on indefinitely, and the range can change periodically, either up or down.

Now let's look at a site that does not suffer from throttling and holds a steady rank within the SERPs.
When one looks at one's analytics, one sees a pattern: for any given period, say a day, the traffic on the site generally remains within a certain range.
How is that range set? The range follows from the ranking, but there is no way to predict what it will be; it is what it is.
How long does this last? It goes on indefinitely, and the range can change periodically, either up or down.

These two scenarios are very similar.

If you don't agree with the second scenario, I would be very interested to see how you would describe a world where there is no throttling and the site's ranking is in a steady state. Obviously, if there is growth or decline in rankings, then there can't be any throttling.
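A minimal Python sketch of the two scenarios above, with made-up numbers (a 900-1100 visits/day band versus a steady mean of about 1,000): looking only at the analytics, the "throttled" series and the "steady rank" series are hard to tell apart.

[code]
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration only: a "throttled" site whose daily visits are
# forced into a band, versus a steady-ranking site whose visits simply
# fluctuate around a stable mean. All numbers are invented.
throttled = rng.integers(900, 1101, size=90)      # 90 days squeezed into 900-1100
steady_rank = rng.poisson(lam=1000, size=90)      # natural noise around ~1000/day

for name, series in (("throttled", throttled), ("steady rank", steady_rank)):
    in_band = np.mean((series >= 900) & (series <= 1100))
    print(f"{name}: {in_band:.0%} of days fall inside the 900-1100 band")
[/code]

Both series spend essentially all of their days inside the same band, which is the point of the comparison.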
4:18 pm on Dec 16, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts: 922
votes: 230


See the link to the Google Hangout where John Mueller denies throttling search results; go to 24:50.
[plus.google.com...]


His explanation brings up one other argument against throttling.
Assume results are being throttled. Suddenly some event occurs and there is a rush of searches for a term that your site ranks for. Since Google is throttling, this traffic does not come to you. Where does it go? To another site that is not throttled? If there is a competitor's site that is not throttled, then wouldn't that site always get all the traffic? No, that site gets some of the traffic and your site gets some other portion, but then that would suggest the other site is throttled too. So we are back at square one: if both sites are throttled, where does the excess traffic go?

For throttling to work, all sites need to be throttled.
4:43 pm on Dec 16, 2016 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Oct 29, 2012
posts:392
votes: 43


I call it throttling, but others may not call it that. I "googled" throttle: "a device controlling the flow of fuel or power to an engine." By that definition, Google controls the flow of traffic, doesn't it? :)

All sites are throttled on a percentage basis. Sometimes it works in your favor, sometimes against you. Thus it is pointless to fight it or try to prove it, beyond providing the best site for users...

When some event occurs, if you are the only one who has that information, you will receive up to your allotted traffic. The excess traffic will go to the competitors that do not "yet" have that information, landing on a less fitting page. This is partially how and why we see some really funny sites ranking for extremely unlikely keywords.

If the competitor is slow or never provides that information, the excess traffic will flow back to you as Google crowdsources that CTR and user-bounce information. If the competitor keeps up the pace and creates a page to suit the event, all the excess traffic will allow for the speedy ranking of that new event-related page. Then it's business as usual: the tug of war over traffic.

The excess traffic goes to a less focused page owned by a bigger brand, or it could go to a super-targeted page owned by nobody. Either way, the throttling and control of traffic allows all sites to get at least some user metrics for Google to calculate and speculate with.
5:28 pm on Dec 16, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3154
votes: 220


NickMNS --
There may be an upper limit (or ceiling) set by the algorithm for ALL websites, say on a daily basis. But for most sites, the ceiling is far, far higher than the normal daily traffic, so the throttling never happens. Sites whose normal daily traffic is near the ceiling are the sites where frequent throttling can occur.

Every site needs to have a theoretical ceiling set for it as a final safeguard against a possible huge manipulation or even a hack in Google's search results that would send enormous traffic to one site, even for irrelevant search terms.
7:20 pm on Dec 16, 2016 (gmt 0)

New User from US 

joined:June 28, 2016
posts:14
votes: 1


bwnbwn, I completely agree with Tedster there.

I'm another one who's monitored Google Analytics trends for over a decade now, so I truly recognize it and know it is happening.

Some sites are not affected -- like the really big ones that always dominate the top 10 on important keywords. That's why some people don't see any of this or believe it. But if you ask me, the big ones SHOULD be throttled as well (why just mess with the middle class all the time?). Since we already see it occurring with mediocre sites, Google should just come out and admit it outright, so that they CAN throttle the big sites as well. Part of the reason they don't throttle big, established sites is that people (webmasters) would easily see the changes when monitoring their keywords and keyword rankings daily. They're still doing everything behind closed doors.

Politics and $$ could be another reason why the big sites don't get throttled.
7:52 pm on Dec 16, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time Top Contributors Of The Month

joined:June 28, 2013
posts:2942
votes: 506


If you want google to think that you're an expert in two different areas, then you need two different sites.

Not in my experience. The structure that has worked extremely well for us over the years has been:

Main topic

Main topic / Major subtopic 1

Main topic / Major subtopic 2

Main topic / Major subtopic 3

And so on (all under the same domain).

We rank extremely well for several of our major subtopics, to the point where a couple of them have become tails that wag the dog.
5:36 pm on Dec 22, 2016 (gmt 0)

New User

joined:Dec 14, 2016
posts:6
votes: 1


I have theorized on this Google throttling myself. Nice to see somebody else coming to that conclusion.

As for the question: "Why would Google do it?" The simple answer is that almost all of Google's profit comes from its advertising business. If webmasters are not getting traffic from Google, they either have to buy it or do without. Google has an inherent incentive to encourage webmasters to pay for advertising.

That is why Google ranks sites and then sandboxes them. My experience starting new sites is that after a few months of no traffic, they will rank for a few months and then be buried in the rankings. Because I signed up for Google Webmaster Tools, a week or two after Google buries my site in the rankings so it gets no traffic, Google sends me an email trying to get me to pay for advertising (to replace the traffic I just lost).

A site that gets traffic from Google does not have to pay Google for advertising. Google knows this.
10:15 pm on Dec 22, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time Top Contributors Of The Month

joined:June 28, 2013
posts:2942
votes: 506


As for the question: "Why would Google do it?" The simple answer is that almost all of Google's profit comes from its advertising business. If webmasters are not getting traffic from Google, they either have to buy it or do without. Google has an inherent incentive to encourage webmasters to pay for advertising.

That argument might be convincing (at least to conspiracy theorists) if all or even most sites were potential buyers of advertising.
11:14 pm on Dec 22, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts: 922
votes: 230


@billzo, I receive mail from AdWords on a regular basis asking me to advertise. The thing is, I receive these emails whether or not I have launched a new site.
Is it possible that the AdWords emails are unrelated to whether or not you have launched a site?

Also, I have launched sites that received good traffic from the start and have steadily grown over the years, and I have launched sites that bombed from day one and never recovered. How would you explain this in terms of throttling?
2:39 am on Dec 23, 2016 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Nov 2, 2014
posts:515
votes: 217


That argument might be convincing (at least to conspiracy theorists) if all or even most sites were potential buyers of advertising.

The argument is consistent with the reality of a for-profit company that has been allowed to self-regulate in an advertising market of which it controls 50% globally.
3:39 am on Dec 23, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts: 922
votes: 230


Sorry, I'm going to step back a few posts.

@aristotle
Every site needs to have a theoretical ceiling set for it as a final safeguard against a possible huge manipulation or even a hack in Google's search results that would send enormous traffic to one site, even for irrelevant search terms.


This makes sense, insomuch as safeguards are required, but why does it need to be a traffic ceiling? That would suggest it is impossible to have a site go viral. My understanding is that the safeguards are in place in the form of anti-spam algos such as, for example but not limited to, Penguin and Panda. I have seen spikes in traffic firsthand, for example as a result of news events. Most recently, during the Olympics, I saw my organic traffic double (a spike that lasted a few hours) after the outcome of one of the events. This was a complete fluke, as my site has nothing to do with the Olympics -- well, maybe indirectly, but it is a stretch. After the spike, which only lasted a few hours, the traffic returned to normal.

So clearly traffic spikes and going viral are possible.

I asked a question in my previous posts that still remains unanswered.
Assume throttling is happening, i.e., the traffic patterns we see now are the result of throttling. How would traffic patterns differ if there were no throttling and the site held a steady ranking (i.e., there is no growth or decline in traffic over a medium to long time horizon, such as a month, quarter or year)?
9:11 am on Dec 23, 2016 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Aug 11, 2008
posts:1428
votes: 150


impossible to have a site go viral.
See, this is where pedantry has to be enforced. Google organic traffic has little or nothing to do with a site "going viral". Almost by definition, viral traffic is not SERP-derived. Your organic spike isn't about going viral so much as increased interest.

Google-enforced traffic ceilings are therefore no impediment to a site going viral. Indeed, going viral might be one factor that increases your ceiling (see also: Chrome browser tracking).

There are two possible explanations for your interest spike that are consistent with throttling. One is that, given that Google is returning a bucket of sites for semantically related searches, if that semantic grouping suddenly gets 20 times its normal traffic, then the ceiling would have to be increased by an aggregate 20 times across the bucket of sites, though probably not linearly (i.e. some sites will get 100 times the traffic, others none). The second explanation is that you are nowhere near your traffic limit.
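Purely as an illustration of that "bucket ceiling" idea -- the site names, ceilings, weights and redistribution rule below are invented, not anything Google has described:

[code]
# Hypothetical sketch: when a semantically related group of queries gets 20x
# its normal demand, the aggregate ceiling for the bucket scales by 20x, but
# individual sites' shares need not scale linearly. All values are assumptions.
baseline_ceilings = {"site-a": 100, "site-b": 50, "site-c": 10}

def rescale_bucket(ceilings, demand_multiplier, weights):
    # Aggregate capacity scales with demand; each site's share follows its weight.
    total = sum(ceilings.values()) * demand_multiplier
    return {site: round(total * weights[site]) for site in ceilings}

# Unequal weights: site-a captures most of the surge, site-c gets almost nothing extra.
print(rescale_bucket(baseline_ceilings, 20, {"site-a": 0.90, "site-b": 0.097, "site-c": 0.003}))
# -> {'site-a': 2880, 'site-b': 310, 'site-c': 10}
[/code]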

Assume throttling is happening, i.e., the traffic patterns we see now are the result of throttling. How would traffic patterns differ if there were no throttling and the site held a steady ranking?
While reversion to the mean is a phenomenon to note, it is extremely unlikely that traffic never clears a particular boundary, where strong starts suddenly turn into near-zero traffic as you approach your limit. Looking at the headline figures is unconvincing; watching the traffic peter out as you approach the limit is soul-destroying.

As for the question: "Why would Google do it?" The simple answer is that almost all of Google's profit comes from its advertising business
No, that is the "Google is evil, now where's the evidence" answer. Not that making a profit is inherently evil...

First of all, Google's ad revenue ultimately comes from users (no users means no one pays for advertising). Users come to Google because G satisfies their search requirements. Therefore, Google will not sabotage search results, as that would hurt their revenue. Secondly, throttling seems like an inefficient way to increase advertising revenue. As EG noted, most sites are not candidates to buy advertising.

As I mentioned earlier, the simple explanation for throttling is that Google sees that making it a one-in-a-million lottery to top the ranking for a keyword is sub-optimal all round. Better to cycle through the near-equals to share the traffic (and test user engagement with a feedback mechanism into the ranking score) than to provide an algo-derived "winner" in a feast-or-famine scenario for coming #1 (or above the fold).

[edit]Edited Google ad revenue source for clarity[/edit]
2:19 pm on Dec 23, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3154
votes: 220


NickMNS --
You quoted from my earlier post, but didn't include the following part:
for most sites, the ceiling is far, far higher than the normal daily traffic, so the throttling never happens.

So for most sites, big temporary traffic spikes are still allowed, since the ceiling is far, far higher than normal traffic.

Also, you didn't address the reason I gave for why a ceiling is needed, namely
as a final safeguard against a possible huge manipulation or even a hack in Google's search results

As Shaddows correctly pointed out, that has nothing to do with "going viral".
3:27 pm on Dec 23, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts: 922
votes: 230


LIFO @aristotle
since the ceiling is far, far higher than normal traffic

If the ceiling is higher than normal traffic, then how is anything being throttled? To throttle something you need to create a restriction that limits flow.

Typically, in the physical world, you throttle the flow of water by introducing a valve into a pipeline. If the valve is fully open and the pipe is not fully flooded, then there is no throttling. For throttling to occur, the valve must cause a restriction by limiting flow. Upstream of the valve there will be a reservoir (an accumulation of water) or an overflow (somewhere for the excess water to go), and downstream there will be a flow less than what would be possible if there were no restriction.

Now let's port this construct to the world of web search. Assume there is an algo that acts as a flow-control valve. That means that upstream of the algo there needs to be a reservoir where search requests can be pooled. In the absence of a reservoir there must be an overflow: one or many sites where the flow of requests can go so that they do not pool (which would cause latency for users). [Side note: this partially disproves my statement above where I say that all sites must be throttled for throttling to work.]

In this scenario a throttled site would frequently reach a set maximum, a deterministic max -- e.g. 10,000 visitors per day -- not a range. You could frequently see values less than the max, but the max would never be exceeded.

So, based on this, let's now look at how this algo would need to function. Many sites rank for one search term or phrase, and there are many search terms and phrases. Let's picture each search term leading to a site as a pipe into the site. Each pipe would require a flow-control valve. Then there needs to be a sensor at the site to count the arrival of searchers. This needs to be repeated across many sites. All the pipes need to start somewhere, and that is at the level of the search request, so each request is a node where all the pipes for all the sites that serve a particular search begin.

We have constructed a massive dynamic system that needs to operate in real time with virtually no latency. The algo would have near-infinite complexity.

But there is a simpler solution: rank websites based on some criteria. The top-ranked sites get most of the traffic, and the further a site sits from the top, the less traffic it gets. If traffic increases, the distribution of traffic simply follows. There are no dynamic systems to manage; it's simple and more or less fair. If a website cheats, push it to the back of the line (Penguin or Panda) or, in an extreme case, remove it from the line completely (a manual penalty).

The outcome is nearly the same as the dynamic system, but with limited complexity, and it is manageable at scale.
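A rough sketch of that ranking-based alternative, for illustration only (the click-through-rate curve below is an assumption, not published data): traffic is shared out by position, and when demand grows, every site's share simply scales with it, with no per-site valve or cap.

[code]
# Sketch of the "rank, then let traffic follow" model described above.
# The CTR-by-position curve is an assumption for illustration only.
ctr_by_position = [0.30, 0.15, 0.10, 0.07, 0.05, 0.04, 0.03, 0.02, 0.02, 0.01]

def distribute(searches, ranked_sites):
    # Share one day's searches across ranked sites; more demand scales every
    # site's share proportionally -- no per-site valve or cap is involved.
    return {site: round(searches * ctr)
            for site, ctr in zip(ranked_sites, ctr_by_position)}

print(distribute(1_000, ["site-a", "site-b", "site-c"]))  # {'site-a': 300, 'site-b': 150, 'site-c': 100}
print(distribute(2_000, ["site-a", "site-b", "site-c"]))  # every share simply doubles
[/code]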

@shaddows
You are absolutely right, my traffic spike was not viral per se, but an increase in interest as you describe.

it is extremely unlikely that traffic never clears a particular boundary,

Yes, obviously: the further a value is from the mean, the lower the probability that you will see that value. Assuming a normal distribution with a mean of 10 and a standard deviation of 3, one would expect to see a value of 13 or greater only about (1 - 0.68)/2, or 16%, of the time; 16 or greater only about 2.5% of the time; and 19 or greater less than 0.14% of the time. Assuming these are daily numbers, you would expect to see 19 about once every two years (assuming your mean remains stable over the period).
So one cannot use the fact that one rarely sees extreme values as proof of throttling. As mentioned above, it would be the opposite. I am not going into the details of why, as I have spent far too much time on this post already; I gotta get back to work :0)
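For what it's worth, those tail figures check out. A quick sketch using SciPy, with the hypothetical mean and standard deviation from the post above:

[code]
from scipy.stats import norm

mean, sd = 10.0, 3.0   # the hypothetical daily-visit distribution from the post

for threshold in (13, 16, 19):
    p = norm.sf(threshold, loc=mean, scale=sd)   # upper-tail P(X >= threshold)
    print(f"P(daily visits >= {threshold}) = {p:.4f}  (~once every {1 / p:.0f} days)")

# Prints roughly 0.1587, 0.0228 and 0.0013 -- i.e. ~16%, ~2.3% and ~0.13%,
# with a 19+ day expected about once every two years of daily figures.
[/code]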
11:52 pm on Dec 23, 2016 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Aug 11, 2008
posts:1428
votes: 150


@Nick - If someone had a mean 10, SD 3 and were complaining of 19 being their throttled limit, they would be crazy.

I'm not at a PC to construct a dataset, so please excuse imprecision. But my first point would be to reject the assumption that web traffic follows a normal (i.e. symmetric) distribution.

Say 13 were a hard limit: ~40% of days at 12.5 (+/- 0.25), ~35% at 7.5 (+/- 0.25), with the remaining 25% occurring between 5 and 13 so as to satisfy the proposition. That is more like the throttled experience.

And again I apologise if that theoretical set cannot exist, as my resources are limited while the whiskey is abundant.
12:05 am on Dec 24, 2016 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Aug 11, 2008
posts:1428
votes: 150


Two things. Pretty sure my numbers above cannot be made to fit mean 10, SD 3. And a Poisson distribution is better than a normal distribution for modelling traffic per unit of time.
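A quick check of both points, simulating the mixture sketched two posts up and comparing it to a Poisson model with the same mean (all numbers are the hypothetical ones from these posts):

[code]
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Sample the hypothetical mixture from two posts up: ~40% of days near 12.5,
# ~35% near 7.5, and the remaining 25% spread between 5 and 13.
component = rng.choice(3, size=n, p=[0.40, 0.35, 0.25])
sample = np.where(component == 0, rng.uniform(12.25, 12.75, n),
         np.where(component == 1, rng.uniform(7.25, 7.75, n),
                                   rng.uniform(5.0, 13.0, n)))
print(f"mixture: mean={sample.mean():.2f}, sd={sample.std():.2f}")   # ~9.9 and ~2.5

# A Poisson model with the same mean, for comparison (counts per unit time).
poisson = rng.poisson(lam=sample.mean(), size=n)
print(f"poisson: mean={poisson.mean():.2f}, sd={poisson.std():.2f}")  # sd ~ sqrt(mean) ~ 3.1
[/code]

So the sketched mixture lands at an SD closer to 2.5 than 3, while a Poisson model with a mean around 10 sits near SD 3.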
3:42 am on Dec 24, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts: 922
votes: 230


Really? You're pulling out the good ol' Poisson distribution...

I chose a normal distribution because it is easily understood by many people and kinda fits anything, if you're not being too rigorous.

The actual shape of the distribution doesn't really matter; my point is that as a value deviates from the mean, the probability of seeing that value is greatly diminished. This holds true for a Poisson distribution too.

In my normal example, I wasn't making any claim of there being a ceiling. I was countering what you were saying about clearing boundaries. To be clear, the boundary you are referring to is inferred; that is, the only way the webmaster has of knowing of its existence is by inferring it from the data. So if one sees many values within a certain range (say, within 1 or 1.5 standard deviations of the mean), then one may infer that this is the ceiling. But it is not; it is simply the range of values one should expect to see for the ongoing process. The fact that one hasn't seen an extreme value in the data does not mean that extreme values cannot occur; they simply haven't, and may never, occur.

Proving the existence of a ceiling would require that the data show a disproportionately large occurrence of the same value (or possibly a narrow range of values), that that value be greater than the mean but close to it, and that it never be surpassed.
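As a sketch of that test (the thresholds and sample figures are arbitrary assumptions, not anything derived from real analytics):

[code]
import numpy as np

def looks_throttled(daily_visits, min_share_at_max=0.2):
    # Heuristic version of the test described above: flag a series if an
    # outsized share of days sit exactly at the observed maximum and that
    # maximum lies above the mean. The 20% threshold is an assumption.
    visits = np.asarray(daily_visits)
    peak = visits.max()
    share_at_peak = np.mean(visits == peak)
    return share_at_peak >= min_share_at_max and peak > visits.mean()

# Days piled up at exactly 10,000 look capped; ordinary noisy traffic does not.
print(looks_throttled([9800, 10000, 10000, 9500, 10000, 10000, 8700]))  # True
print(looks_throttled([9800, 10400, 9100, 12000, 8700, 9900, 11200]))   # False
[/code]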
12:51 pm on Dec 24, 2016 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Sept 12, 2014
posts:378
votes: 65


There would be no reservoir because there is an unlimited number of valves. As one valve closes, another one opens. My yard irrigation system can handle this, so I am sure an algo can too. The valves might be triggered by flow amount, a preset time, randomly, or by another algorithm.
If the valves are not applied to all niches, and we don't know what each other's sites are doing, we cannot compare each other's sites to conclude whether throttling exists or not, as it might only exist in certain niches -- i.e., it might only be applied to certain niche e-commerce/info sites.
1:28 pm on Dec 24, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Dec 29, 2001
posts:1121
votes: 33


That argument might be convincing (at least to conspiracy theorists) if all or even most sites were potential buyers of advertising.

I'm no conspiracy theorist, as I tend to believe that the simplest explanation, as it relates to human nature, is what is actually going on.

Google is acting in the interest of Google, and they are a business that controls a large percentage of search traffic.

So, yes, there is improved profit to be had by rotating search results, manipulating advertising costs and advertising delivery (AdSense), and other methods of profit shaping.
2:35 pm on Dec 24, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts: 922
votes: 230


@toidi, correct: there does not need to be a reservoir. But the unlimited number of valves is not the reason. As I explained above, one can avoid a reservoir if one has an overflow -- in this case a single site, or a few sites, that can handle a larger flow of users.
Your irrigation system, which must have a reservoir, is not a good example, for several reasons:
1- It is a pull system; that is, when it needs water, it opens a shut-off valve and takes it. The web is a push system: the shut-off valve is always open, and when a user comes to the system they are pushed to a website. As a webmaster you have no way of pulling users to your website. (Yes, you can buy traffic, but fundamentally the traffic is still pushed to your site; you just get more of it.)
2- There is a reservoir: it is the city's water main. When the shut-off valve is closed, the water continues to flow in the main or accumulates in a city reservoir.
3- The rate of flow of each of your sprinkler nozzles is not regulated in real time. The flow rate is adjusted manually to a fixed position.
4- The flow rate of water from the city's main is, for all intents and purposes, constant. On the web, the flow rate of users is random and users cannot be discarded.
5- Your sprinkler system has 10, 20, maybe 100 nozzles if you live on a big estate (as I am sure you do!), so a valve on each nozzle is borderline manageable. With billions of sites and, say, 100 search terms per site, that would be on the order of hundreds of billions of flow-control valves to manage in real time. Just keeping track of them would break my computer.

As you can see, throttling the flow of users in a manner similar to a sprinkler system would not be possible.
2:27 pm on Dec 25, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Feb 3, 2014
posts:977
votes: 202


Traffic shaping, as Tedster used to call it, has been ongoing since 2010. Maybe this recent rash isn't Google at all, considering that SERPs are the same or better, yet all traffic has disappeared in the last week or so. Perhaps it's the big bad "cyber attack" that the outgoing administration has been blustering about initiating on the bad guys of the world. Bank sites are winning, media outlets losing -- very interesting. As usual, we're just collateral damage.
7:21 pm on Dec 25, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member essex_boy is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 19, 2003
posts:3187
votes: 4


Also, if you have connections to people at Google, or are generally 'nice' to them... they can give your site value a minor boost as well.
-- Doubtful.
7:54 pm on Dec 25, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3154
votes: 220


So the websites in the same niche are like a bunch of hogs feeding at the same trough. The pipes from Google pour the pig slop [search traffic] into the trough, and the valves always stay fully open to let the slop pour out. The big fat hogs get most of the food [traffic], and the skinny hogs get just enough to keep them alive. The supply of food varies due to random statistical fluctuations.

Some hogs think that the food is being throttled, but others believe that it is being rationed.

Some of the food is made from the corpses of zombies, which doesn't have much nourishment, and is worse than no food at all.
8:30 pm on Dec 25, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts: 922
votes: 230


@aristotle, I like that analogy, with one slight nuance. The big pigs are the pigs that find the food by the most effective means; maybe they were big to start with and could block others, or they colluded with other pigs. The skinny ones have figured out how to stay alive but can't seem to get enough food to grow big. Finally, the others couldn't even reach the trough, so they died.

It could be that it is the big pigs that are causing the throttling, possibly without realizing it. They let some food fall from the trough, the small pigs can pick it up, and it is enough for them to survive. But when food is scarce, the big pigs become more careful not to spill, leaving the small pigs wondering why their food isn't coming. This would look a lot like throttling to a small pig.
10:25 pm on Dec 25, 2016 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Oct 29, 2012
posts:392
votes: 43


Pigs eat whatever gets fed to them.

The amount of food is more or less predetermined depending on the pig.

Sometimes the throttle favors big pigs, sometimes it favors red pigs, sometimes it favors smaller pigs.

Bigger pigs get more food, smaller pigs get less.

Sometimes a smaller pig can prove itself to the owner so that it gets more food. But most of the time the big pigs tell the owner what amount and type of food they need, so the owner prioritizes and feeds the big pigs first, since they're more important than the small piggies.

Each pig is different, so the food is fed accordingly.

Sometimes there are two pigs that are so alike that the owner throttles the food to just one and starves the other to death.

Now, the big pigs wonder why they can't get all the food in the world. The owner wants to keep the little piggies alive so that consumers have slightly more choice if they wish, but most of the time the big generic pig is shoved down the general public's throat.

When food is scarce, it depends on the owner's mood: if the owner favors the smaller pig, the bigger pig will really starve and take a massive hit. But if the owner still favors the big pig, there will be nothing left for the small piglet.

But in the end, it really depends on what people want to eat, as well as what the owner thinks people want to eat. The thing is, the owner is now letting computer software / algorithms determine what people want to eat, based on what people are ordering. The owner will only step in when people end up with toxic pigs.
4:14 am on Dec 26, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Feb 3, 2014
posts:977
votes: 202


Are we discussing pig farming or traffic shaping now?