I remember 15 years ago, I was approached to delete DMOZ links that were making competitors rank.
Which was an exercise in reducing the amount of "good" the algorithm was calculating for a particular site, not adding negative reflections on it.
The problem with the approach Google has these days (in my worthless opinion) is they got way too infatuated with negative factors. Way back in the day they would gather up all the "signals of quality" sites had, crunch their algorithm, and then rank the sites based on their scores.
Then about 6 years ago they got frustrated with their inability to shoo away all the sites they felt were ranking higher than they wanted them to. So, someone at the plex came up with the bright idea to run around gathering up "signals of bad quality" (traits Google felt ran counter to their goal of having sites they prefer rank higher than ones they didn't).
So, "bad traits" took on more importance than "good traits" because the focus became more and more about rooting out sites that would rank high despite Google's disapproval: "These sites are low quality, let's find some common traits, algorithmically punish the ones that have them, and drive them from our midst".
Now we have webmasters running around the internet like lunatics screaming at owners to remove links to their sites, deleting half their content because Google probably thinks it's "too thin", and "SEO" companies offering services that can "de-rank" sites....
Sometimes you gotta live with a little bad to bring forth the good. Sometimes what you think is bad ain't so bad to others, and they might be able to decide that for themselves with their trusty back buttons.
Between the overkill on cleansing the results of these bad traits, and the over-reliance on signals that large brands give off, the results are a lot like vanilla ice cream these days: they're good, but man are they bland. What once was a fun experience searching around is now pretty much an exercise in finding good deals on replacement parts.