Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

No growth after June 2019 update


hossiti

3:32 pm on Sep 7, 2021 (gmt 0)



Hi everyone,
I have a strange problem with my website's Google traffic. Let me describe it first:
Our site is a music site (no DMCA or copyright issues) where we introduce artists and bands and their albums by genre and mood.
Most of my visitors come from Google (more than 92%)
I did not do any black-hat SEO or suspicious activity
My site lacks backlinks; there are maybe 10 to 12 good linking domains (including 5 Wikipedia links)
All pages are fully optimized and SEO-friendly, and pass Google Core Web Vitals with great scores (more than 90) for both mobile and desktop
The site is not content-based; there are many short pieces of content and a few long ones (but all content on the site is exclusive)
For the last 2.5 years the site has been updated daily with 1 to 6 new pages
New pages get indexed by Google within 2 to 24 hours
There are keywords for which we appear above some strong brand sites with more than 10 years of activity
Most of the crap backlinks have been removed using the disavow tool

Now, the problem is:
Before the June 2019 Google update we were ranking up and growing day by day. After that giant update the rise stopped and we fell from 2,300 sessions a day to 1,300, and since then we just go up and down (up to near 3,000 and down to near 1,500), mostly topping out at about 2,000 sessions a day. We rank well for many keywords and users visit us via many different searches, but it seems there is a limit on how many people can find us on Google, and it's somewhere around 2,000 sessions.
I observed that if we have 1,950 sessions by 11pm, and we normally average 120 sessions in the late-night hours, there will be only 50 new sessions from 11pm to midnight, and I believe that's because of the limit. On the other hand, if we have 1,800 by 11pm, then there will be around 200 new sessions in the next hour!
In the last two months we got some new backlinks but nothing changed. Our competitors, with less domain age, fewer backlinks, lower design and SEO standards, and no text content at all (just album name, artist name, genre and tracklist), grow far faster and get 10 times more visitors (according to Alexa).

I don't know if there is something I did wrong or something I didn't do. It's just frustrating that with all the heavy work we do, sticking to all the rules and building a really standard, user-friendly website, for more than a year: nothing! Meanwhile, other websites full of spam data just grow and get vastly more visitors than us. Here is a screenshot of my Analytics sessions report from late 2018 to today:
https://i.ibb.co/L1C9hDs/ga.jpg [ibb.co]

I'd appreciate any help or advice, something I just can't find by searching the web, something that could save my life! Something that's right in front of me but I just can't see!

Sorry for my English, and thanks for your time

jediviper

12:45 pm on Sep 8, 2021 (gmt 0)

5+ Year Member Top Contributors Of The Month



How often do you add new content?
Did you bother adding any new links in the last few years?

hossiti

1:06 pm on Sep 8, 2021 (gmt 0)



1. We add new artists and albums every day; all artists have text content, but only some albums have a brief introduction
2. In the last year we got just a few Wikipedia links and 2 or 3 links from personal blogs that linked to us

frankleeceo

1:56 pm on Sep 8, 2021 (gmt 0)

10+ Year Member Top Contributors Of The Month



I personally call this type of observation "traffic throttling". Others call it something different, or outright refuse to see it and believe the data is fake. Simply put, your site metrics have hit a ceiling of traffic that Google allows you to have.

I observed this phenomenon as early as 2013 or 2014. The update wasn't the cause of your troubles; it's baked into the algo. Many different factors cause it. The bottom line is that it's time to do something different to break out of it: buy competitors, toss millions into ads, build links, build more sites, etc.

The more content you build, the more traffic it takes away from your old content, so your net traffic remains the same.

This is the crucial point for you: become a brand (start investing money into ads for more brand awareness, email marketing, SMS marketing, social, the whole shebang), or join the spammers. If you continue what you are doing right now, you are relying on luck and will most likely get no results.

NickMNS

2:54 pm on Sep 8, 2021 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I personally just call this type of observation - traffic throttling. Others call it differently or outright refuse to see it and believe that this data is fake.

Traffic throttling is not a thing; it is a fallacy. The idea of throttling is a form of the "Gambler's Fallacy", and this is demonstrated really well by the original post.
Now, the problem is:
Before June 2019 google update we were just ranking up and were growing day by day,

It supposes that the growth in early 2019 should have continued indefinitely. Why should it? The past doesn't predict the future. The ranking improved to a certain point and then it stopped.

What one is observing in the data is a steady ranking. For any given rank within the SERPs, one can expect a given amount of traffic that will remain within a range. The range is the result of normal traffic fluctuations (variation in the number of searchers going to Google) and is expected.

If you want to call that throttling, then fine, but there is no knob being adjusted on a continuous basis. If this pattern, this so-called "throttling", is not normal, then what would you describe as "normal traffic"? Is continuous traffic growth normal? I strongly suggest you read the Wikipedia page on the Gambler's Fallacy linked below, specifically the section about perceived bias in coin tosses.
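To make this concrete, here is a toy simulation (my own illustrative sketch; the search volume and click-share numbers are made up): a steady rank taking a fixed share of a randomly fluctuating daily search volume produces a sessions graph that hovers in a band and looks "capped", with no throttle anywhere in the model.

```python
import random

random.seed(1)  # deterministic for the example

# Assumed, made-up parameters: average daily searches in the niche and
# the share of clicks a steady ranking position earns.
MEAN_SEARCHES = 10_000
CLICK_SHARE = 0.20

# Simulate 60 days of normal fluctuation in searcher volume.
daily_sessions = []
for _ in range(60):
    searches = random.gauss(MEAN_SEARCHES, 800)
    daily_sessions.append(int(searches * CLICK_SHARE))

average = sum(daily_sessions) // len(daily_sessions)
print(min(daily_sessions), average, max(daily_sessions))
# Sessions cluster around 2,000 even though nothing caps them.
```

Nothing in this loop withholds traffic; the band comes entirely from ordinary variance around a stable rank.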

One can also observe from the graph posted that the ranking has changed over time: mid-2019 (likely May), end of 2019 (likely December), mid-2020, and so on, all on dates that appear to coincide with Google updates, as one would expect. The positive takeaway from the data is that these updates have had limited impact on traffic, suggesting that the website is viewed positively by Google.

To the OP: you appear to be doing a great job, keep it up. It is very frustrating to reach a plateau. My guess, without knowing any specifics, is that you have reached a point where you lead the competitors that are similar in size and scale to you, but the next step up the competitive ladder means you now face much bigger and more established competitors. This battle is to be fought outside of Google.

You are fortunate to be established in the music industry niche with a sizable audience. Your industry is currently undergoing some very exciting innovations that you may be in a position to capitalize on and use to surpass larger, more established legacy players. Or you can spend your time here discussing "throttling" and other borderline Google conspiracy theories. The choice is yours.


[en.wikipedia.org...]

aristotle

12:59 am on Sep 9, 2021 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Perhaps this site doesn't have enough trust and authority in the view of Google's algorithm. Backlinks from Wikipedia and a few obscure blogs probably aren't enough. There is an expected correlation between total traffic and the number of naturally acquired backlinks; a small number of backlinks isn't consistent with high levels of traffic.

frankleeceo

2:19 am on Sep 9, 2021 (gmt 0)

10+ Year Member Top Contributors Of The Month



Nick, my man, straight from your wiki article:

"The gambler's fallacy is a deep-seated cognitive bias and can be very hard to overcome. "

According to you, all organic traffic plateaus are naturally occurring events without third-party interference. Think about what you are saying here and the role that Google plays. And in this case, the plateau lasted, oh I don't know, two years, despite thousands more pieces of content, tens of thousands of new songs, an expanding music audience, expanding query volume, expanding YouTube views, expanding Google profit?

When all else fails, there's always magic.

[seroundtable.com...]

You are right, I shouldn't call it traffic throttling. From now on, I will simply call it magic.

Hey OP, it's magic that your traffic stayed the same for 2 years. :)

Why? Magic signals that's why!

NickMNS

2:48 am on Sep 9, 2021 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Frank, my man, I asked a simple question, but to be fair it was buried in my long post. So here it is:
If this pattern, this so-called "throttling", is not normal, then what would you describe as "normal traffic"?


But to respond to your remarks:
According to you, "all organic traffic plateaus" are naturally occurring events without third-party interference.

No. It is certainly possible that a website could have continuous growth, but you could also see traffic decline after a period of growth, or any combination of growth, decline, and plateau.

...think about ... the role that Google plays here.

Let me oversimplify: assume there is a website that gets its traffic from only one "keyword". Google returns a search result for that keyword and ranks the website. Depending on your rank, you can expect a relatively fixed share of the traffic: rank 1 gets 60%, rank 2 gets 20%, rank 3 gets 10%, and so on. So if your website ranks number 2 and 1,000 people search each day on average, then on average you get 20% of the traffic, and you can expect 200 users per day on average. So yes, that can be seen as throttling, but really I prefer to call it what it is: "ranking".

Obviously it is far more complex than this, with thousands of keywords, countless geolocations, different devices, etc. But given enough traffic and a relatively long time span (a week or a month), one should still see a statistical pattern emerge. Google is kind enough to provide you with stats if you use GSC: you can see the number of impressions and average position, and that correlates directly with your traffic. Small changes in position can produce dramatic changes in traffic, but in my experience, in my niche, my average position (rank) tends to be stable over time, and so my traffic is as well.
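That toy model in code (the click-share percentages are the illustrative ones from the example, not measured CTR data):

```python
# Assumed click share per ranking position, from the toy example above.
CLICK_SHARE_BY_RANK = {1: 0.60, 2: 0.20, 3: 0.10}

def expected_sessions(rank: int, daily_searches: float) -> float:
    """Expected daily visitors under the fixed-share model.

    Ranks outside the table get a small assumed tail share (2%).
    """
    return daily_searches * CLICK_SHARE_BY_RANK.get(rank, 0.02)

# Rank 2 on a keyword searched 1,000 times a day -> about 200 visitors.
print(expected_sessions(2, 1000))
```

A steady rank plus a steady search volume yields a steady expected traffic level, which is exactly the plateau being described, with no throttle in the model.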

frankleeceo

2:56 am on Sep 9, 2021 (gmt 0)

10+ Year Member Top Contributors Of The Month



Nick,

I understand your logic that averages and all keywords will stabilize towards a historical mean. There are some niches where this simply doesn't apply, in my opinion.

Have you participated in a niche that is constantly expanding (sometimes with great velocity)? I am referring to the entertainment, song, game, or movie niches.

NickMNS

3:02 am on Sep 9, 2021 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Have you participated in a niche with constantly expanding (and sometimes with great velocity).

No, my niche tends to be steady.

frankleeceo

3:08 am on Sep 9, 2021 (gmt 0)

10+ Year Member Top Contributors Of The Month



Yeah Nick,

I think that's why you never experienced the data that I experienced firsthand. I have been in the video-game guide niche for a long time. A lot of the reasoning about historical averages does not apply to this niche: keywords and traffic expand and contract rapidly with the underlying audience and popularity. This is how I studied and observed traffic control firsthand, by noting troubling discrepancies.

When something new releases, new articles absorb all the new traffic. During this time, very highly ranked articles from the past are suddenly replaced by weaker domains. When the new wave passes, the old articles' traffic and rank return.

This results in a smoother traffic curve over time, instead of the spikes created by sudden popularity or interest.

The reverse is also true: the valleys created by ebbing interest are smoothed by the return of rankings for the previous articles that "lost" during that brief period. So both spikes and valleys are smoothed around the mean, achieved by magic signals changing ranks over time to assign a specific volume of traffic.

I did a lot of personal tests involving moving content, merging and splitting domains, and monitoring the effects on the relationship between overall traffic and keywords.

NickMNS

3:39 am on Sep 9, 2021 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I have actually experienced this very thing, but from the opposite side. My website is based on long-tail keywords: I get very few visits to any one page but many individual visits to a wide variety of pages. A page may get a view once and then not again for weeks or months or even years. But every so often an event occurs, typically a breaking news event, that has some relevance to a page on my site. Given that the term is a true "long-tail" term, there is almost no content online for it, and what content there is is likely not from a trustworthy source. So that one page will immediately get a rush of traffic amounting to thousands of page views in a matter of hours. But given a bit of time, as journalists research and publish stories related to the event and provide more relevant perspective on the term, traffic begins to taper, and within a day or two that page goes back to getting no traffic, just like before.

Essentially what happens is that Google has no data on the search term but still needs to provide a result, so it shows what it has. As "high authority" websites begin to provide content and Google collects user data for the term, it is in a better position to rank things more "traditionally", and so the SERPs tend to revert.

Google is fundamentally stupid; throw enough poop at it and eventually something will stick. That, in a nutshell, is the "spam" strategy. If Google doesn't have any data on a search term, it will return the best fit it has. This applies to new terms and to long-tail terms. In the case of breaking news stories, Google is very quick to sort things out. For new video games, probably a little slower. For a random term, it may never sort things out and may continue to show spam for weeks.

frankleeceo

3:44 am on Sep 9, 2021 (gmt 0)

10+ Year Member Top Contributors Of The Month



During those spikes that you saw, if they last more than a few hours, do you see incremental X+X traffic? Or do your other pages lose search traffic during that time?

NickMNS

4:04 am on Sep 9, 2021 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Do you see incremental X+X traffic?

Yes, other pages get traffic as usual.

frankleeceo

4:08 am on Sep 9, 2021 (gmt 0)

10+ Year Member Top Contributors Of The Month



Thanks for checking. And one last question :) What's your average daily visitor count for the domain? And during a spike, how big is the increase usually?

hossiti

8:37 am on Sep 9, 2021 (gmt 0)



First of all, thanks for your response to the topic, I appreciate that.

What I understand is that there may be throttling or there may not. For me, I should say I believe there is a kind of throttling being applied to my traffic. Let me explain another example from my site:
Our site ranks for many keywords. Sometimes I notice that a keyword suddenly grows, going from around 0.1% of inbound traffic to 7% (according to the Analytics landing-pages report). As we have a shop on the site, we prefer keywords that lead to purchases, so after we saw some keywords go up while our income dropped, I simply removed that page (via Search Console). I expected a 7% drop in Google traffic, but that didn't happen: the 7% went to other keywords, and sessions stuck to that roughly 2,000 limit! After a while I decided to bring the URL back, so I cancelled the removal request. For 2 or 3 days I got the extra traffic, around 200 extra sessions a day, and then everything went back to normal: traffic dropped again to around 2,000, and that keyword now accounts for around 2% of inbound traffic!

If there is a particular amount of search in a niche (and there is, barring something trending), why should something like this happen? Say I get 200 sessions from a keyword, for example "jazz music". If I remove the page, I should lose 200 visits, and when I cancel the removal I should get my visitors back (or maybe fewer). So I think there is something going on.

Hey OP, it's magic that your traffic stayed the same for 2 years. :)


This magic is really, really painful! Google decides how much we deserve to earn and how our lives should go. But their magic wand points the other way for spammers, who just grow and make more and more money!

So your website ranks number 2, 1000 people search each day on average and on average you get 20% of the traffic, then you can expect 200 users per day on average


Unfortunately, as my example shows (and I tried this with more than 10 keywords), Google doesn't stick to these rules and does its own strange things!

After all this, if you have experience in how to overcome this thing and move forward, or there is something you did that helped push that limit upward, please let us know. It would really save lives. Thanks again :)

robzilla

9:58 am on Sep 9, 2021 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Throttling is a myth. It may make sense to you as an explanation (interpretation) of what you're seeing, but it doesn't make any sense from a search engine's perspective. If it's a good result, there's no point in withholding it.




[edited by: not2easy at 12:02 pm (utc) on Sep 9, 2021]
[edit reason] typo/user request [/edit]

NickMNS

1:27 pm on Sep 9, 2021 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



so after we saw some keywords went up and our income dropped I just removed that page

This is ridiculous, and is the most likely source of your problems.

Before I start, let me say that Google's GSC reporting is crap and must be used with great caution. Note that as bad as GSC is, everything else is worse, especially 3rd-party paid/free tools.

Number 1. Keywords are dead
The concept of a "keyword" is, for all intents and purposes, dead; this isn't the web of 1995. Just because a word appears on one page of your website doesn't mean Google will only send traffic to that page for that word. Moreover, the same word or group of words can be used by users with different intentions, and the reporting has no clear way of showing you that distinction.

Our site is ranked in many keywords,

GSC also only shows you a subset of "keywords", about 10%, and that subset is not a random sample. The keywords are chosen so that you see the largest diversity of terms, which means frequently used terms are under-counted and less frequent terms are over-represented.

Number 2. Removing pages
To be honest, I'm not sure exactly what you mean, but whatever you are doing, be it deleting or noindexing or whatever, it does not have an immediate impact on Google's index and ranking algorithm. Google can take weeks or months to see these changes, and in the interim it is sending traffic to pages that aren't there, or the pages are not appearing in search as they should. This sends mixed signals to your users and to Google.

After a while I decided to get back the url so cancelled the removal request, for 2 or 3 days

So, as if removing the page wasn't bad enough, you then undo it. Now Google really has no clue what to do with these pages. If you're lucky, Google's algo will realize there is some confusion and simply ignore these actions; in the worst case, Google will see this activity for what it is, "manipulation", and penalize you for it.

Number 3. Optimizing on a metric
If you have a "keyword" that brings 100 users who convert at a rate of 10%, you have sales of $10 (assume $1 per sale). If you have a second keyword that also brings 100 users but they convert at 1%, that's total sales of $1. Your logic says you should forgo the $1 of sales because it is not as good as the $10 of sales. So instead of earning $11 total you prefer to earn $10, and not only that, you are putting effort and resources into making sure the additional $1 of sales does not come to you. I'm sorry, but this is madness.
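The arithmetic above, spelled out (all figures are the post's hypothetical numbers, not real conversion data):

```python
def sales(visitors: int, conversion_rate: float, value_per_sale: float = 1.0) -> float:
    """Revenue from one keyword's traffic, under the assumed $1-per-sale figure."""
    return visitors * conversion_rate * value_per_sale

high_converting = sales(100, 0.10)  # about $10
low_converting = sales(100, 0.01)   # about $1

# Keeping both pages always earns more than blocking the weaker one.
print(round(high_converting + low_converting, 2))  # 11.0
print(round(high_converting, 2))                   # 10.0
```

Blocking the low-converting page can only reduce the total; the weaker keyword's revenue is additive, not a substitute for the stronger one's.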

If some page does not convert as well as the others and you know which page it is, then focus on optimizing that page to maximize revenue: add new features, links that direct users to other parts of the site, and so on. Don't block that traffic.

frankleeceo

1:37 pm on Sep 9, 2021 (gmt 0)

10+ Year Member Top Contributors Of The Month



@robzilla,

So let's reframe your way of thinking and think from the search engine's POV. If you have an array of good results and an array of good websites, how do you determine the percentage of traffic each site should get for that range of queries?

Further, if new sites start ranking (or stop ranking) for certain queries, how do you drive that discovery phase? Google potentially sends test traffic to crunch the numbers. How does the system determine how much to send? What percentage? And how does it send traffic in a way that doesn't negatively impact the good sites?

The ability to assign and withhold traffic almost has to be baked into the system for any of the above features to work properly. Without being able to analyze and sequence traffic, you cannot send the right traffic, for the right queries, at the right time.

"If it's a good result, there's no point to withholding it", - you are absolutely right in a broad sense, but there's a point to personalize and continuously AB test to see if there are eve better sites to match the query. Or in a more sinister sense, a point to segment the high buying intent traffic in order to sell ads at higher price point.

frankleeceo

1:47 pm on Sep 9, 2021 (gmt 0)

10+ Year Member Top Contributors Of The Month



@NickMNS

Have you tried to remove or split large parts of your websites before?

hossiti

2:05 pm on Sep 9, 2021 (gmt 0)



This is ridiculous, and is the most likely source of your problems.


I should make this clear; I probably used some wrong words to describe the situation (sorry again for my English).
When I said "keywords" I didn't mean it literally (I know that's dead). Let me explain another way:
I have a page like [mysite.com...] . Using Google Analytics, I found that over a certain period, visits to this page as a "landing page" increased dramatically. As my visits are mostly from Google, I assumed I had just ranked for some phrases (not keywords) that lead to this page. There is a high bounce rate and exit rate on this page, and the script I run on my site shows that people who land on this page rarely become members or make a purchase.

So I used Search Console's own Removals tool to temporarily remove the URL (the page itself is still there), so it would no longer be shown to searchers until I cancelled the request. After I used that tool, the page dropped as a landing page, but my overall traffic didn't change. If 200 visitors used to come from Google and land on that page, now it became 0, but other pages on my site took its place; for example, I got more visitors landing on /genre/rock or /genre/hiphop and so on, and the increase was near 200.
And when I cancelled the removal and the page came back in the results, it regained, say, 100 landing visits from Google, and there was a decrease of 100 across the other pages, so my inbound traffic from Google stayed constant: I got 2,000 before the removal, after the removal, and even after cancelling the removal.

frankleeceo

2:11 pm on Sep 9, 2021 (gmt 0)

10+ Year Member Top Contributors Of The Month



@hossiti

Seeing what you are saying is like seeing myself a few years back. It's great that you are doing lots of experiments and tests to learn on your own; questioning everything is how you learn. Here's some of my advice:

1. Stop what you are doing with this, lol
Removing pages and re-adding them is really bad and does send mixed signals. There is a huge chance you will never get the traffic back.

2. Site segmentation (but it has a downside)
Potentially split your site in a way that makes sense, tailored to a specific vertical/audience, and make each site really great for that niche. I think there are lots of creative ways to break up a site. The big "but" is that you sacrifice your ability to build a long-term brand for short-term gain.

3. Focus on building a brand
Find your value proposition and make yourself stand out. It can be done in a variety of ways; search for how, or think it through. Chances are that if you do something awesome, competitors will copy you and you lose your edge over time, so building brand awareness and a loyal visitor base is super important. 2,000 users per day is nothing compared to your potential, but as Nick mentioned, competition is tough, and sadly most sites die. So you need to build business connections as well, the whole shebang.

4. Shift your focus to visitors
Create the best site, best UX, best value, and focus less on keywords. Keywords are not dead and have their place, but in your case, creating unique value seems to be the main way to get to the next level.

5. Shift your focus to traffic
Keywords and rankings are dead in the sense that traffic volume is a much better indicator of your success. Shift your focus to which pages bring in the most traffic, and optimize those pages to help those users.

6. Find a part-time job (just life advice)
In this day and age, relying only on Google is suicide. You and your team should consider getting part-time jobs as side income. Google can halve your traffic overnight for whatever reason they like. Also, don't celebrate if they double or triple your traffic overnight; it may be short-lived.

7. Traffic throttling
I don't understand why so many people get triggered by this term. In my mind it's really just part of the system. Sites are throttled to participate in "query" auctions of various audience and geo sizes, in a sandbox, at any given time. When you gain enough "signals" (user signals, linking signals, social signals), your site is allowed to participate in a bigger pool of queries and rankings. Focusing on this aspect isn't productive, though; move on. Again, shifting your focus to building better sites will produce better and more profitable results.

Good luck.

[edited by: frankleeceo at 2:25 pm (utc) on Sep 9, 2021]

hossiti

2:11 pm on Sep 9, 2021 (gmt 0)



Throttling is a myth. It may make sense to you as an explanation (interpretation) of what you're seeing, but it doesn't make any sense from a search engine's perspective. If it's a good result, there's no point to withholding it.


Thanks for your response. I don't know if it's real or just a myth as you say, but my experience shows something strange is happening.

hossiti

4:46 pm on Sep 9, 2021 (gmt 0)



@frankleeceo

Thanks a lot for your time and wise advice. As you said, we should do more on branding. We did a complete UI upgrade this year based on user behaviour, which led to a lower bounce rate, more page views, and longer session durations, and we made the site as fast as possible. Now it's time for some advertising, social media activity, and better interaction with users, I think. It seems there isn't any guaranteed action, and Google's behaviour is not predictable at all.

NickMNS

1:41 am on Sep 10, 2021 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@Frank
You still haven't answered my question as to what normal traffic looks like. Without answering that question, you can claim that any pattern looks like it is being "throttled".

The ability to assign and withhold traffic almost has to be baked into the system for any of above features to work properly.

It could be argued that by simply ranking websites, Google is "assigning" traffic. But in a strict sense that isn't really true, because the user always has the choice to visit any of the links presented on the page, or none at all.

But withholding traffic: how do you suppose that works? A search is entered and Google says, "No! No result for you!" (think Soup Nazi from Seinfeld). Obviously not. So then, does it redirect traffic to some other website? Why? Because of buying intent, geolocation, or some other factor? Yes, but that is exactly what Google does: it provides a ranking based on a set of factors (200+, apparently!).

Is that throttling? Well, I suppose you can call it that, or filtering, sorting, segregating, or simply ranking. In the end the discussion is only about semantics. Throttling suggests that there is a knob being turned and a gate being opened or closed, and that simply is not the case. Does it sometimes look like that might be happening? Sure. Does it at times feel like the throttle has been opened to full force? Sure. But it is simply the ranking algo doing its work. Is it just? Is it fair? Probably not, but that is the game you play when you depend on Google for traffic.

frankleeceo

2:30 am on Sep 10, 2021 (gmt 0)

10+ Year Member Top Contributors Of The Month



@Nick

I'll try to hit all the statements that you make.

Your main theory is that traffic growth will always reach a plateau "naturally" at some point. I completely agree with that. But no traffic ever reaches that plateau "naturally"; we reach plateaus within throttled, sandboxed traffic far earlier.

Visitors are puppets; they have no choice. It's an illusion :). I would think that as a marketer you know that. 80% of searchers click on the top 10 results, less than 10% go to page 2, and less than 1% go to page 3 (I am just throwing these numbers out; too lazy to check the real ones). And answer boxes have killed direct site visits. We all know this, so the idea that visitors have control is nonsense.

Normal traffic, in a strict sense, for all of us, as you expand your content and keyword share, should be consistently growing (until the point where you cannibalize your own share or a competitor's market share). If you have better sites and better content, you should continuously erode the poor sites. This is not the case. The traffic plateau is reached far earlier, by design. At that point it doesn't matter whether you expand keywords or expand content; your organic traffic will be limited to the plateau. As you build additional content, you erode your previous traffic share (cannibalizing completely different keyword sets and content) until major updates are calculated from all the signals that "lift" your throttled state. Call it an algo update, core update, link-graph recalculation, whatever; semantics, like you said.

Small sites get throttled, big sites get throttled. Traffic is assigned and flowed to different parts of the web based on "magic".

"Throttling suggests that there is knob that is being turned and a gate is being opened or closed, and simply is not the case. " And this statement is false if you re-read what OP is saying and really think about. In OP and my case a few years back, we turned off the knob intentionally on our end, and Google "turned on" the knob else where to flow same amount of traffic to site, with completely different keywords.

Everyone who accepts and observes throttling has tested and seen a gate being opened or closed one way or another. If you quickly expand a site's content, you will see throttling fairly quickly.

During the growth and expansion stage, as you expand your keywords and content, your traffic naturally goes up. As in your experience, you have many single page views, and occasionally you get spikes, which are incremental traffic. It's X+X. That makes sense. So in that spirit, if you build a million pages all targeting different keywords, should you receive that incremental traffic? Do you really reach the plateau that early?

But once you rank for keywords that generate a certain amount of traffic and then gain additional traffic for new keywords, it's never X+X; it's X-A+X+A. We are all fighting for throttled traffic within containers. Sometimes you break out, sometimes you don't. The OP stayed in this range for 2 years, despite the underlying queries expanding, oh I don't know, 50% over 3 years? Do you think the OP made no new content during those two years, or that his competitors made exactly the same amount of content and improvements, beautifully offsetting his gains?
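To spell out the two notations being argued here (pure arithmetic illustrating the claim above, not a model of anything Google actually does):

```python
def incremental(existing: int, new: int) -> int:
    """X + X: new content's traffic lands on top of the existing total."""
    return existing + new

def conserved(existing: int, new: int) -> int:
    """X - A + X + A: the new content's gain is offset by an equal
    loss on existing pages, so the total never moves."""
    offset = new  # A: the traffic the old pages "lose"
    return (existing - offset) + new

print(incremental(2000, 300))  # 2300: the plateau would break
print(conserved(2000, 300))    # 2000: the plateau holds
```

The dispute in this thread is over which of these two functions better describes observed organic traffic after adding content.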

----
Withholding traffic is exactly as you said: it's fair and it's not fair. The key for me is to recognize that throttling exists, and to work my business around it. I know what it looks like, and I know what to do to break it or live with it.

You might need to test some data yourself; do crazy stuff to break your sites. My first experience with throttling was at about 1 million monthly users for a single site, for 2 years or so. To break out of it I split my sites just to really mess myself up: I cut the site in half, I killed content that generated 30% of traffic, I did all sorts of crazy stuff. I got double the traffic with exactly the same content within 2 weeks on two different domains. I didn't need any more convincing after this. One can argue the site gave poor signals, whatever, but I am just calling it "magic" at this stage. It's magical, really.
------
I remember discussing this with you a few years back, it is indeed semantics, it's either "algo", "throttling", "magic", "myth", "ranks", "data", "impression share", "filter" whatever lol. It is a Google game, and traffic is indeed controlled, it doesn't matter how you name it. I learned to live with it and make a living navigating around it. In the end, you recognize the traffic is indeed controlled, and that's good enough for me :).

manu14

3:41 am on Sep 10, 2021 (gmt 0)

5+ Year Member Top Contributors Of The Month



@frankleeceo
What you wrote is true - throttling is real. However, whatever new content I add, the traffic remains the same: sometimes direct traffic increases, but then organic traffic decreases to match the earlier level; then organic traffic increases and direct traffic comes down.

What is the best way to get out of this? Any tips to break free and increase organic traffic?

NickMNS

4:29 am on Sep 10, 2021 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



OK, so there are a few issues that you are not taking into account, and they are subtle but real.

1- GSC keyword data is garbage (to put it nicely), and cannot be relied upon for anything. (3rd party data is worse).
"In OP and my case a few years back, we turned off the knob intentionally on our end, and Google 'turned on' the knob elsewhere to flow the same amount of traffic to the site, with completely different keywords."

I'm not doubting what you are saying, but given the state of the keyword data there is no way to make such a claim. The reason, as I mentioned above, is that Google does not show you all the keywords; it shows a small sample, and that sample is not statistically representative of all keywords. Google tries to show as many different keywords as possible; once a keyword has been shown, Google continues to provide data for it whether or not it remains relevant, and it doesn't show the keywords that are most relevant.

When Apple rolled out iOS 14 they included a bug in the Google Search app, such that when a user is logged in their searches appear as "Direct Traffic", but the app adds query strings to the URL, and those get reported in Google Analytics and in your server logs. One of the parameters included in the query string is "as_q", which is the keyword entered into Google Search. On my site I get about 1% of my traffic from this source; it is not a lot, but if you span a week or a month you can get a large enough sample. The beauty of the data from Apple is that it can be considered a random sample and should be statistically representative of all traffic, so from that data one can make inferences. You can then compare those results to the search terms shown to you by Google. Google's data is biased. It isn't nefarious - from what I can tell they try to provide diversity - but that messes up all their stats, rendering them completely useless.
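To make the log-mining idea concrete, here is a minimal sketch of pulling the "as_q" parameter out of requested URLs. The URLs below are hypothetical; the only assumption taken from the post is that the query string sometimes carries an "as_q" parameter holding the search term.

```python
from collections import Counter
from urllib.parse import parse_qs, urlparse

def extract_keywords(request_urls):
    """Tally `as_q` keywords found in a list of requested URLs.

    Each entry is assumed to be the full request URL as it appears in
    your analytics export or server logs; URLs without an `as_q`
    parameter are simply skipped."""
    counts = Counter()
    for url in request_urls:
        params = parse_qs(urlparse(url).query)
        for q in params.get("as_q", []):
            counts[q.strip().lower()] += 1
    return counts

# Hypothetical log entries for illustration:
urls = [
    "https://example.com/albums/blue?as_q=blue+album+tracklist",
    "https://example.com/artists/foo?as_q=foo+band+genre",
    "https://example.com/albums/blue?as_q=blue+album+tracklist",
    "https://example.com/about",
]
print(extract_keywords(urls).most_common(2))
```

Run over a week or a month of logs, the resulting counts give the kind of random sample described above, which you can then compare against the terms GSC chooses to show you.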

So yes, remove a page from search and Google will show a different keyword, but that doesn't mean your actual traffic is coming from that keyword. What you are seeing is an artifact of Google's crappy reporting.

2- The relationship between published content and traffic is not linear; it is diminishing.
So with that spirit, if you build a million pages all targeting different keywords, should you receive that incremental traffic?

No! I have tested this and it is absolutely false. And it becomes "more false" with scale (OK, not really, because "more false" is not a thing). My main website has tens of millions of pages of unique content, and I have worked on projects with even more (hundreds of millions). I don't know how to explain this simply, because it is complex; it has to do with the distribution of the frequency of searches and the "long tail" of that distribution. The distribution is exponential: the bars that start off (left on the graph) are very high, and as you move right the bars quickly come down to just a small whisker. These wispy bars are many and stretch far to the right (the long tail). Each bar represents a keyword, the height represents the number of searches, and the bars are tightly packed (call this the "all traffic" curve).

For a given rank you grab a subset of that distribution for your site. When you are starting out you grab only a few bars, so on your graph the bars are widely spaced, because your content doesn't cover all those keywords. Your rank determines your traffic, which is represented here by the height of the bar. For the sake of argument, say your rank gets you 10% of the traffic; the bars are then 1/10 the height of the "all traffic" graph. This recreates a curve that looks like the "all traffic" curve but sits below it, and since it is below it, it cuts off the tail. [Pause, breathe... I said it was complicated!]

Now, it's a new site, not much content exists, and there is plenty of space between the bars. Your site also includes some content that sits in the tail, but it gets no traffic because of a low rank. Now you create more content, and thus new keywords, and you begin to fill in the gaps between bars. Traffic grows with the new content. But remember, the shape starts off with a few really high bars that quickly become shorter. So as you create content, the gaps between the high bars get filled in early, since you have already covered those keywords. New content still adds keywords, but now most of them land in the tail, where you get no traffic because of your rank. Now new content is no longer increasing traffic.

But Google sees all the great progress and steps up your rank: boom, much more traffic in an instant. You get more from the body (the high bars), but now you also capitalize on the content in the tail, as your graph covers a larger area. Happy and proud of the new traffic, you add more new content; the body is already full, so you keep adding to the tail, but the content you add doesn't bring much traffic because search volume in the tail is low and much of the tail is still cut off. You now see the diminishing benefit of adding new content. At some point you can add all the content you want and it will bring no new traffic.

There is no nefarious control here; you are simply observing what the math explains. The only lever of control that Google has is the ranking algo. Nothing more is needed, and ultimately the outcome is the same.
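The diminishing-returns argument above can be sketched as a toy simulation. Everything in it is invented for illustration (a Zipf-shaped volume curve, a flat share of volume per rank, a one-visit cutoff playing the role of the cut-off tail); it is not a model of Google's actual algorithm, only of the shape of the math being described.

```python
def site_traffic(pages, share, total=100_000, cutoff=1.0):
    """Toy model: traffic for a site covering the top `pages` keywords.

    Keyword i's search volume follows a Zipf curve (total / i); the
    site's rank earns it `share` of each keyword's volume, and any
    keyword whose captured volume falls below `cutoff` contributes
    nothing -- that's the cut-off tail."""
    traffic = 0.0
    for i in range(1, pages + 1):
        captured = share * (total / i)
        if captured >= cutoff:
            traffic += captured
    return traffic

# At a fixed rank, adding pages shows diminishing, then zero, returns:
for pages in (100, 1_000, 10_000, 100_000):
    print(pages, round(site_traffic(pages, share=0.01)))

# ...while a rank boost (a higher share) lifts traffic instantly, both
# by raising the bars and by reaching further into the tail:
print(round(site_traffic(100_000, share=0.10)))
```

With a 1% share, traffic stops growing once new pages land past the cutoff (here around the 1,000th keyword), no matter how many more pages are added; raising the share to 10% both multiplies the per-keyword traffic and extends coverage ten times deeper into the tail.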

frankleeceo

5:14 am on Sep 10, 2021 (gmt 0)

10+ Year Member Top Contributors Of The Month



@Nick my man

Thanks for the thorough explanation. Let me see if I understand it correctly by repeating what I read out of your math/build

1. I think keywords are a bad way to put it; I really look at the traffic data through GA to the different content of the site. This makes sense based on your number 2: when we removed the content, the algo simply filled up the other "bars". So this part makes sense and is logical, and I don't think you are disagreeing with what we saw.

2. Based on your projects/model:
Total traffic = the total "distribution" area underneath the bars = controlled by the number of keywords that your site can rank for.
The algo determines the bars, or keywords, that you rank for, filling in the "distribution".
As you build content, if your site's "distribution" doesn't rank for the bars, you get no traffic from those bars.
When your site ranks higher, you start to get traffic from more of the "distribution".
By controlling the keyword ranks, the algo ultimately closely or "magically" controls, or "ranks in/out", the amount of traffic released to a site for whatever keywords.
Finally, the "distribution" determines the amount of traffic a site receives; keyword ranks are really there just to fill in the blanks. So keywords don't really matter. The "distribution" is the only metric that matters to traffic volume.
As the site grows, the "distribution" increases over time.
If you expand your content faster than the "distribution" increases, your traffic becomes limited by the "distribution".

The "distribution" is starting to sound a lot like "throttle" in my dictionary :)

Throttle dictionary definition:
"a device controlling the flow of fuel or power to an engine"

Well, I had a hunch from discussing with you years back that my idea of "traffic throttling" is really just how you define ranking; I just called it differently than you :). So I stopped talking then. "Nothing more is needed and ultimately the outcome is the same." Yeah, I agree. I didn't think the system is evil by itself; it's just a system and a necessary model.

I think the main downfall is that the general "public" doesn't think about ranking the way you do, and most people have trouble understanding that ranking doesn't equate to traffic. "Throttling" is a much better way to describe this outcome, in my opinion.

manu14

10:24 am on Sep 10, 2021 (gmt 0)

5+ Year Member Top Contributors Of The Month



@NickMNS
Your answer seems very knowledgeable and perfectly logical, but to me it reads the same as what @frankleeceo is saying: Google has set a limit on a site's traffic and that is not changing - call it ranking, bars, distribution, throttling, or nefarious control, whatever you like.