"It appears from this thread the general consensus is older niche sites have lost rankings to spammy, poorly written content, or high authority news sites without substance, mainly USA traffic."
I think there is another explanation for the "old niche site" decay that doesn't get much discussion: something I call "legacy data choking".
Our site is now 25 years old. We were THE first in our niche, were featured on CNN, etc. in the late 90s, and had a ton of authority. We hit our peak around 2007 or so, and it has been down, down, down ever since. And by that, I mean literally every single core update. Just a slow and steady march down, even though our content has actually improved considerably in both scope and quality over the past decade. Yes, the glory days are gone for other good reasons: Google adding more ads, videos, etc. into its SERPs, people spending more time in apps vs. the web, and so on. But the intense march downward I have seen in our site and other "very old" niche sites is nearly universal.
I think one of the main problems is that we have to battle the accumulated detritus of 25 years of redirects, 404s, canonicals, SEO attacks, and tens to hundreds of thousands of poor quality backlinks that simply have nothing to do with us.
When I look at our Google Search Console, our coverage is 288K valid pages and 457K "excluded" pages. We have 61,000 pages GSC has marked as "page with redirect". Most of those 301s were put in place 5+ years ago.
We have thousands of pages that were redirected 20 years ago (and even some long-ago 404'd pages!) that Google STILL TRIES TO CRAWL. We did another site overhaul 10 years ago, and Google STILL TRIES TO CRAWL those ancient links, because long-abandoned websites and blogs still reference them. We still get thousands of crawl attempts on old HTTP pages even though we switched to HTTPS 7 years ago. Our "link profile" is a complete mess because of all this legacy clutter. I suspect Google is absolutely choking on our 25 years of history and our rankings are suffering seriously because of it. Who knows how much crawl budget Google spends on this legacy crap, but it is extremely significant.
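If you want to measure this on your own site, here is a minimal sketch, not our actual tooling, that tallies how much Googlebot traffic lands on redirected or dead URLs by scanning a web server access log. It assumes the common Combined Log Format; the field positions and file handling are assumptions about your server setup:

```python
import re
from collections import Counter

# Assumed Combined Log Format:
#   host ident user [date] "METHOD path PROTO" status bytes "referer" "agent"
LOG_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def legacy_crawl_share(log_lines):
    """Count Googlebot requests, and how many of them hit 301/404/410 URLs.

    Returns (total_googlebot_hits, Counter of legacy status codes).
    """
    total = 0
    legacy = Counter()
    for line in log_lines:
        m = LOG_RE.match(line)
        # Skip malformed lines and non-Googlebot user agents
        if not m or "Googlebot" not in m.group("agent"):
            continue
        total += 1
        status = m.group("status")
        if status in ("301", "404", "410"):
            legacy[status] += 1
    return total, legacy
```

Run it over a day of logs and you get a rough ratio of crawl budget burned on legacy URLs versus live pages. In our case, that ratio has been ugly for years.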
Our backlink profile is a similar shambles. As an informational source, our website has been skimmed and scammed and copied thousands of times. Our text has been lifted and spread. Spam/spun/doorway-page hacked-domain operations often use us to try to make themselves look more legitimate, so we have tens of thousands of backlinks from thousands of spammy domains pointing to us. We have had times in the past where bad actors used black-hat methods to create proxies that scrape data from our site in real time to generate ad revenue of their own. Twice (2005 & 2019) our site was effectively cloned wholesale, and in many cases the cloned sites outranked us.
I have spent so much of my time over the years battling all of this crap. It has been a complete and utter nightmare: endlessly stressful and time-consuming trying to prevent bad actors from harming us in Google.
As I write this, our link profile is a complete and utter mess. Due to 25 years of activity *from other actors, none of which are in any way related to us* and many of which are automated, 95% of our current link profile is spam, hacked domains, crap, and stuff that has no value. Our disavow file is at 8000 lines.
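For anyone who hasn't dealt with one: a disavow file is just a UTF-8 text file you upload via Google's disavow tool, one entry per line, with `#` for comments and a `domain:` prefix to disavow an entire domain rather than a single URL. A hypothetical few lines (these domains are made up, not from our actual file):

```
# Hacked-domain network discovered March 2019
domain:spammy-example-1.com
domain:spammy-example-2.net
# Single scraped page still linking to us
http://scraper.example.org/copied-article.html
```

Now imagine maintaining 8000 lines of that, accumulated over decades.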
So to Googlebot, our site looks like a complete mess. Even though our disavows are in place and our 301s are correctly configured, to Google we probably look like a huge tangled ball of wool: a vast web of chained 301s and junk backlinks. God only knows how the algorithms have been choking on all of this ancient data, trying to make sense of our site. At best, all of this legacy crap has probably watered down our rankings; at worst, it has resulted in us being severely and increasingly penalized with each and every core update.
Now picture a new site, created in 2021. It has a beautiful link profile because it is new. It doesn't have tens of thousands of crap links pointing to it, its internal linking structure is obvious, and it has no 301s. Its structure is obvious and clean.
We remain #1 for thousands of keywords in every single search engine except Google, which is evidence that we haven't declined due to competitive issues. In fact, I can only name one other site over the past 20 years that has been able to compete with us in quality and depth of content, and even their rankings appear to be dropping. We stand at #1 for our core keywords *everywhere* except Google. Google hates us. Google thinks we are a hot mess. Google doesn't understand us. And Google continues to drop us.
It is my hypothesis that Google's algorithms were designed to work best when applied to newer websites. When they try to process a very old site like ours, they simply don't do a good or fair job, because they were never tested against such extreme cases. Machine learning probably makes it worse too: Google "learns" what a healthy site structure and linking profile looks like mostly from newer websites, then looks at the 25 years of data it has collected about our site over the decades and has a coronary trying to digest it.
It is my theory that unless you are a huge site with such overwhelming authority that this legacy stuff doesn't matter, you simply enter a downward spiral, caught in the cracks of algorithms that can't make sense of so much old data.
We have noticed this issue because we were so early on the web. I expect more and more of you will get caught up in this trap as time goes by and your "web histories" start working against you, and the issues I'm bringing up here will become more talked about. Perhaps some day Google will tweak its algorithms to correct for such problems.
Until then, we will continue to circle the drain.
[edited by: westcoast at 2:10 pm (utc) on May 12, 2021]