There seem to be several different forms of filters applied to websites.
What kinds of filters or "caps" do you notice on your sites when it comes to Google search traffic?
What I notice with my sites is that traffic volume is very heavily based on user metrics, and possibly on incoming links and all those SEO goodies.
Like... if your site is age X with Y type of user metrics, you are "allowed" a certain percentage of the traffic for a query.
For example, for a popular term with 100k searches, a decent site can receive 20% of traffic which is 20k.
However, for a less popular term with 10k searches, a decent site at the same 20% would only receive 2k.
The filter works through the range of possible queries and distributes users across different sites to mine their resulting behavior.
The filters act less in a strict-number sense and more in a percentage sense, relative to the overall search pattern of "users".
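The percentage-based idea above can be sketched in a few lines. To be clear, this is just a toy model of the hypothesis in this thread; the 20% "allowance" and the search volumes are illustrative numbers from the post, not anything Google has published.

```python
def allowed_traffic(monthly_searches: int, allowance_pct: float) -> int:
    """Traffic a site would be 'allowed' under a hypothetical percentage cap."""
    return int(monthly_searches * allowance_pct / 100)

# Popular term: 100k searches at a 20% allowance -> 20,000 visits
print(allowed_traffic(100_000, 20))  # 20000

# Less popular term: 10k searches at the same 20% -> 2,000 visits
print(allowed_traffic(10_000, 20))   # 2000
```

The point of the model is that the cap scales with query volume, so the same "decent site" sees very different absolute numbers on different terms.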
Other than the per-term filter, there also seems to be a "site-specific" search volume filter that caps total traffic based, again, on user metrics and the type of site. This filter seems to work strictly on numbers and percentages, on a site-over-site basis.
For example, your site can only grow "organically" by 10% week over week when you are in tier 1, and only 20~30% week over week during the growth-spurt stage.
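If such a week-over-week cap existed, growth would compound geometrically. Again, a minimal sketch of the hypothesis only; the 10% tier-1 cap and the starting traffic figure are assumptions for illustration.

```python
def capped_growth(start_traffic: float, weekly_cap_pct: float, weeks: int) -> float:
    """Maximum traffic after `weeks` if growth is capped at weekly_cap_pct per week."""
    return start_traffic * (1 + weekly_cap_pct / 100) ** weeks

# At a hypothetical 10% weekly cap, 1,000 visits/week compounds to
# roughly 1,464 visits/week after 4 weeks (1000 * 1.1^4).
print(round(capped_growth(1000, 10, 4)))  # 1464
```

This is one way such a cap could be checked against real analytics: if the ceiling were real, growth curves would hug that geometric envelope rather than overshoot it.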
Does anyone have similar thoughts or observe similar phenomena? I want this discussion to focus on the filter aspect with regard to search traffic volume only.
I wish I could zero in on these filters in a specific way, but I have never had any success.
I studied mathematics in college, and I'm aware of how deceptive "patterns" can be. It's very common to think I see one in a site's analytics, only to see a data spike a few weeks or months later that changes my mind.
Ultimately, I stopped looking for such things, as intriguing as the idea is. Even if I nailed down a specific filter, it's not very likely that I could act on it, so I stopped spending my energy that way.