Are search results being throttled?
Representativeness heuristic
Throttling is not legal regardless of the country you are doing business in, even if you are referring to organic results only.
And once you have the user engaged, you can adjust their experience to the point where an outside observer would say it goes too far. After a viewer has visited a number of pages on my site, I force them to register.
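To make that concrete, here is a minimal sketch of that kind of registration wall, assuming a Flask app with server-side sessions. The five-page threshold and the route names are my own placeholders, not anything from the actual site:

```python
# Minimal registration-wall sketch. Assumes Flask with server-side
# sessions; the threshold and routes are hypothetical placeholders.
from flask import Flask, session, redirect, request, url_for

app = Flask(__name__)
app.secret_key = "replace-with-a-real-secret"

PAGE_VIEW_LIMIT = 5  # hypothetical number of free page views

@app.before_request
def enforce_registration_wall():
    # Skip the wall for registered users and for the registration page itself.
    if session.get("registered") or request.endpoint == "register":
        return None
    views = session.get("page_views", 0) + 1
    session["page_views"] = views
    if views > PAGE_VIEW_LIMIT:
        return redirect(url_for("register"))

@app.route("/register")
def register():
    return "Registration form goes here."

@app.route("/article/<slug>")
def article(slug):
    return f"Article: {slug}"
```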
With Google's authority, they can get away with all kinds of manipulation that on the surface looks like poor user experience but in reality works for Google's bottom line, simply because the viewers don't know where else to go.
I guess Google can't decide WHO clicks those links (i.e., good users or time-wasters/"zombies," as I read on here) - how would they know the user's intention?
Enhanced cost per click (ECPC) is a bid strategy that automatically adjusts your manual bids for clicks that seem more likely to lead to a sale or conversion on your website.
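For anyone unfamiliar with how that works in principle, here is a toy sketch of the ECPC idea: scale a manual bid by how much more (or less) likely a given click is to convert than average. The model, the multiplier caps, and the numbers are my own assumptions for illustration - this is not Google's actual algorithm:

```python
# Toy ECPC-style bid adjustment. The conversion-rate inputs and the
# multiplier bounds are invented for illustration only.
def ecpc_adjusted_bid(manual_bid, predicted_conv_rate, baseline_conv_rate,
                      max_multiplier=2.0, min_multiplier=0.5):
    """Raise the bid when a click looks more likely than average to
    convert, lower it when it looks less likely."""
    if baseline_conv_rate <= 0:
        return manual_bid
    multiplier = predicted_conv_rate / baseline_conv_rate
    multiplier = max(min_multiplier, min(max_multiplier, multiplier))
    return round(manual_bid * multiplier, 2)

# Example: a $1.00 manual bid on a click predicted to convert at 4%
# against a 2% baseline gets doubled (capped at the 2x bound).
print(ecpc_adjusted_bid(1.00, 0.04, 0.02))  # 2.0
```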
Zombies and non-converting traffic are another story that I really think shouldn't be discussed in detail here, with the focus on throttling.
It is all done at a scalable AI/machine-learning level. Sometimes they're right, and sometimes they're wrong. It's constantly evolving.
I should elaborate... Non-converting zombie traffic has been discussed in multiple places and multiple threads, including monthly chatter and multiple dedicated posts. This is the first time that actual "volume"-based throttling is being discussed. And I think you're right: zombie traffic does indirectly verify the existence of volume-based throttling, since filler traffic is sent to reach the site's given quota.
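Purely to illustrate that theory (not any confirmed Google behavior), here is a toy model of the "quota plus filler" idea: a site has a daily click quota, and when quality clicks fall short, low-intent zombie visits top up the total. Every number and field name here is invented:

```python
# Toy model of the speculative "volume quota + filler traffic" theory.
# All values are invented; this encodes the forum theory, nothing more.
from dataclasses import dataclass

@dataclass
class SiteQuota:
    daily_quota: int          # hypothetical target clicks per day
    quality_clicks: int = 0   # clicks from engaged users
    filler_clicks: int = 0    # low-intent "zombie" visits

    def record_quality_click(self):
        self.quality_clicks += 1

    def top_up_with_filler(self):
        """Return how many filler visits would be needed to hit quota."""
        shortfall = self.daily_quota - (self.quality_clicks + self.filler_clicks)
        needed = max(0, shortfall)
        self.filler_clicks += needed
        return needed

site = SiteQuota(daily_quota=100)
for _ in range(60):
    site.record_quality_click()
print(site.top_up_with_filler())  # 40 filler visits to reach the quota
```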
Personally, I think there is a deliberate attempt to deliver a consistent level of traffic to some sites - and further, that this is reasonable, as cycling through 100 potential destinations is better than only showing a static 5 above the fold on a single SERP. Especially if you give different frequency weightings to those sites and use personalisation data as part of the selection criteria for which subset you surface in any given circumstance.
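Here is a rough sketch of what that kind of rotation could look like: a weighted random pick of 5 sites from a pool of 100, with per-user personalisation multipliers nudging the weights. The site names, weights, and selection method are all hypothetical - this models the idea in the post above, not any known ranking system:

```python
# Hypothetical weighted rotation of search results. Candidate names,
# weights, and the sampling scheme are assumptions for illustration.
import random

def pick_results(candidates, personalization, k=5, seed=None):
    """candidates: site -> base frequency weight.
    personalization: site -> per-user multiplier (missing sites get 1.0)."""
    rng = random.Random(seed)
    sites = list(candidates)
    weights = [candidates[s] * personalization.get(s, 1.0) for s in sites]
    chosen = []
    for _ in range(min(k, len(sites))):
        # Weighted draw without replacement.
        s = rng.choices(sites, weights=weights, k=1)[0]
        i = sites.index(s)
        sites.pop(i)
        weights.pop(i)
        chosen.append(s)
    return chosen

# Example: 100 candidate sites, one boosted for this particular user.
pool = {f"site{i}.example": 1.0 for i in range(100)}
prefs = {"site7.example": 3.0}
print(pick_results(pool, prefs, k=5, seed=42))
```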
Can we argue that throttling/zombies occur only for first-page results, or is it website-based, like carrying a restriction on quality traffic?