Ok so there are a few issues that you are not taking into account, and they are subtle but real.
1- GSC keyword data is garbage (to put it nicely), and cannot be relied upon for anything. (3rd party data is worse).
In OP's case and mine a few years back, we intentionally turned the knob off on our end, and Google "turned on" the knob elsewhere to send the same amount of traffic to the site, with completely different keywords.
I'm not doubting what you are saying, but given the state of the keyword data there is no way to make such a claim. The reason, as I mentioned above, is that Google does not show you all the keywords; it shows a small sample, and that sample is not statistically representative of all keywords. Google tries to show as many different keywords as possible, and once a keyword has been shown it continues to provide data for it whether or not it remains relevant, while the keywords that actually drive your traffic may never appear.
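To make that bias concrete, here is a toy simulation (all the numbers are invented, this is just to illustrate the mechanism): a visit-weighted sample of a long-tail keyword distribution is representative of traffic, while a diversity-style sample that lists each distinct keyword once, the way the GSC report behaves, makes the head keywords look vanishingly unimportant.

```python
import random

random.seed(0)

# Invented "true" traffic: a Zipf-ish long tail over 5,000 keywords.
N = 5000
visits = [int(100_000 / (i + 1)) for i in range(N)]
total = sum(visits)

# Representative sample: draw individual visits, so popular keywords dominate.
pop = random.choices(range(N), weights=visits, k=2000)

# Diversity-biased sample: one row per distinct keyword, volume ignored.
div = random.sample(range(N), 2000)

head = set(range(10))  # the 10 highest-volume keywords
rep_share = sum(1 for i in pop if i in head) / len(pop)
div_share = sum(1 for i in div if i in head) / len(div)
true_share = sum(visits[:10]) / total

print(f"true head share:               {true_share:.1%}")
print(f"visit-weighted sample says:    {rep_share:.1%}")
print(f"diversity-style sample says:   {div_share:.1%}")
```

The visit-weighted estimate lands near the true head share; the diversity-style sample reports almost nothing for the head, which is exactly the kind of distortion described above.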
When Apple rolled out iOS 14 they included a bug in the Google Search app: when a user is logged in, their searches appear as "Direct Traffic," but the app adds query strings to the URL, and those get reported in Google Analytics and in your server logs. One of the parameters in the query string is "as_q", which is the keyword entered into Google Search. On my site I get about 1% of my traffic from this source. It is not a lot, but if you span a week or a month you can get a large enough sample. The beauty of the data from Apple is that it can be considered a random sample and should be statistically representative of all traffic, so from it one can make inferences. You can then compare those results to the search terms shown to you by Google. Google's data is biased. It isn't nefarious; from what I can tell they try to provide diversity, but that messes up all their stats, rendering them completely useless.
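As a sketch (the URLs and counting are made up for illustration; only the "as_q" parameter name comes from the above), pulling those keywords out of logged landing URLs looks something like:

```python
from collections import Counter
from urllib.parse import urlsplit, parse_qs

def keyword_from_url(url):
    """Return the as_q query value if present, else None."""
    params = parse_qs(urlsplit(url).query)
    values = params.get("as_q")
    return values[0] if values else None

# Hypothetical log lines standing in for real requests.
log_urls = [
    "https://example.com/page?as_q=blue+widgets&foo=1",
    "https://example.com/other",
    "https://example.com/page?as_q=blue+widgets",
    "https://example.com/page?as_q=red+widgets",
]

counts = Counter(k for u in log_urls if (k := keyword_from_url(u)))
print(counts.most_common())
```

Accumulate that counter over a week or a month of logs and you have the sample you can treat as random, which you can then compare against the terms GSC shows you.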
So yes, remove a page from search and Google will show a different keyword, but that doesn't mean your actual traffic was coming from that keyword. What you are seeing is an artifact of Google's crappy reporting.
2- The relationship between published content and traffic is not linear; it is diminishing.
So in that spirit: if you build a million pages all targeting different keywords, should you receive that incremental traffic?
No! I have tested this and it is absolutely false. And it becomes "more false" with scale. (OK, not really, because "more false" is not a thing.) My main website has tens of millions of pages of unique content, and I have worked on projects with even more (hundreds of millions). I don't know how to explain this simply because it is complex, and it has to do with the distribution of the frequency of searches and the "long tail" of that distribution. The distribution decays exponentially: the bars at the start (left on the graph) are very high, and as you move right they quickly come down to just a small whisker. These wispy bars are many and stretch far to the right (the long tail). Each bar represents a keyword, its height represents the number of searches, and the bars are tightly packed (call this the "all traffic" curve).
For a given rank you grab a subset of that distribution for your site. When you are starting out you grab only a few bars, so on your graph the bars are widely spaced, because your content doesn't cover all those keywords. Your rank determines your traffic, which is again represented by the height of the bars. For the sake of argument, say your rank gets you 10% of the traffic; the bars are then 1/10 the height of the "all traffic" graph. This recreates a curve that looks like the "all traffic" curve but sits below it, and since it sits below it, it cuts off the tail. [Pause, breathe... I said it was complicated!]
Now, it's a new site, not much content exists, and there is plenty of space between the bars. Your site also includes some content that is in the tail, but it gets no traffic because of your low rank. You create more content, thus new keywords, and you begin to fill in the gaps between bars. Traffic grows with the new content. But remember, the shape starts off with a few really high bars that quickly become shorter. So as you create content, the gaps between the high bars get filled in early; soon you have covered all those keywords. New content still adds keywords, but now most of them land in the tail, where you get no traffic because of your rank. At that point new content no longer increases traffic.
But Google sees all the great progress and steps up your rank, and boom, much more traffic in an instant. You get more from the body (the high bars), but you also start capitalizing on the content in the tail as your curve covers a larger area. Happy and proud of the new traffic, you add more content; the body is already full, so you keep adding to the tail, but that content doesn't bring much traffic because search volume in the tail is low and much of the tail is still cut off. You now see the diminishing benefit of adding new content. At some point you can add all the content you want and it will bring no new traffic.
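The whole story can be captured in a toy model (every number here is an invented assumption, not real data): keyword volumes follow a long tail, your rank captures a fixed share of each keyword's volume, and captures below one visit round to zero, which is the cut-off tail.

```python
def traffic(pages, share, volumes):
    """Traffic when your content covers the first `pages` keywords
    and your rank captures `share` of each keyword's search volume.
    Captures below one visit contribute nothing (the cut-off tail)."""
    total = 0
    for v in volumes[:pages]:
        captured = v * share
        total += int(captured) if captured >= 1 else 0
    return total

# Zipf-ish long tail of search volumes over 100,000 possible keywords.
volumes = [int(10_000 / (i + 1)) for i in range(100_000)]

# At a 10% capture rate, growth flattens: past a point, new pages add zero.
low = [traffic(n, 0.10, volumes) for n in (100, 1_000, 100_000)]
print("10% share:", low)

# A rank boost to 50% lifts everything and unlocks more of the tail.
high = [traffic(n, 0.50, volumes) for n in (100, 1_000, 100_000)]
print("50% share:", high)
```

In this model, going from 1,000 pages to 100,000 pages at the low capture rate adds exactly zero visits, while the rank boost multiplies traffic by more than the ratio of the shares, because it also uncuts part of the tail. The only lever that moves anything is rank.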
There is no nefarious control here; you are simply observing what the math explains. The only lever of control Google has is the ranking algo. Nothing more is needed, and ultimately the outcome is the same.