"Is it possible that age, a.k.a. being an established site, used to be a ranking signal but has been deprecated or dialed back?"
Perhaps that's a factor, but the huge downward pressure on older sites in Google, versus our standing in every other search engine, indicates something is wrong. The exception is the massive brands, which seem immune because of their sheer size.
Personally, my gut feeling is that 25 years of link detritus is probably the root issue. New sites have amazing good:spam backlink ratios, and you can imagine an algorithm taking a look at that backlink profile as a whole and going "well this looks pretty good". On older sites that ratio drops fast, and after 20 years you end up with what looks like the remnants of a very long, very wild party.
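To make that intuition concrete, here's a purely hypothetical back-of-the-envelope sketch (not anything Google has published, and the numbers are invented) of how a ratio-style signal might view the same site at year 2 versus year 20, assuming something upstream has already labeled each referring domain as good or spam:

```python
# Hypothetical sketch: how a simple good:spam ratio degrades as spam
# accumulates over a site's lifetime. The labels and counts are assumed
# inputs, not anything Google actually exposes.

def backlink_ratio(good_domains: int, spam_domains: int) -> float:
    """Fraction of referring domains that look legitimate."""
    total = good_domains + spam_domains
    return good_domains / total if total else 0.0

# A young site: mostly organic links, little time to attract junk.
young = backlink_ratio(good_domains=200, spam_domains=20)      # ~0.91

# The same site 20 years later: good links grew steadily, but spam
# (scrapers, parked domains, old forum blasts) grew much faster.
old = backlink_ratio(good_domains=4_000, spam_domains=9_000)   # ~0.31

print(f"young site ratio: {young:.2f}, old site ratio: {old:.2f}")
```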
I'm not sure about your site, but ours has a TON of legitimate backlinks (many thousands), and those are mixed in with tens of thousands of spam backlinks from thousands of domains. Scraper / malware sites love taking snippets of our content and then adding backlinks to us to try to make themselves look legit. We have battled this for two decades now, and while Google has gotten better at stopping this stuff over the past couple of years, it still misses some of it.
The bigger problem, I think, is the backlinking that morons have done over the years: it looks suspicious, but it occurs on non-spam sites. Someone in 2005 posts a bunch of links to one of our pages on a forum, and then a configuration issue or non-canonical URL variants cause that link to appear on a thousand different forum pages. Now take wildly varying examples like this, in scope and size, 500 times over a 20-year period. All of it in aggregate looks unnatural, and I suspect there is a decent chance Google demotes our site because of it. It's not so much a single case of "unintentional unnatural linking", but when you get a large number of independent instances over a 25-year period, I could totally see Google reading something into it that it shouldn't. Or something.
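As a hypothetical illustration of that mechanism (the forum URLs and parameter names here are made up), this is roughly what the duplication looks like if you normalize the linking URLs yourself: one actual link, thousands of apparent source pages.

```python
# Hypothetical sketch: one forum thread, linked once, can show up as
# thousands of distinct backlink source URLs if the forum appends
# session IDs, sort orders, tracking tags, etc. Normalizing strips the
# noise and reveals it's really a single link.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Query parameters that (in this made-up example) don't change the page content.
NOISE_PARAMS = {"sid", "sessionid", "sort", "utm_source", "utm_medium"}

def canonicalize(url: str) -> str:
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k.lower() not in NOISE_PARAMS]
    return urlunparse((parts.scheme, parts.netloc.lower(), parts.path.rstrip("/"),
                       "", urlencode(sorted(kept)), ""))

variants = [
    "https://forum.example.com/thread.php?t=123&sid=abc1",
    "https://forum.example.com/thread.php?t=123&sid=zzz9&sort=desc",
    "https://FORUM.example.com/thread.php?sid=qqq2&t=123",
]

print({canonicalize(u) for u in variants})
# All three collapse to a single canonical source URL.
```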
You know how GSC lists the top 1000 sites linking to your website? Well, our list, as displayed by Google Search Console, is at least 50% spam / malware sites. A great many others are parked or 404 domains (probably from spam attacks years ago). If you were to judge the authority of our site based on what GSC's backlink report displays, your only conclusion would be that our site is trash. Total trash. Google says they devalue spam, but then they fill the GSC backlink profile ("Here's what we're judging your site on!") with it. Doesn't fill one with much confidence that they are ignoring those sites and links, does it?
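If you want to quantify that on your own site, here's a rough sketch of the kind of check we've done. It assumes you've exported the top linking sites list to a CSV with one referring domain per row; the filename and column layout are assumptions, not the actual GSC export format.

```python
# Hypothetical sketch: take an exported list of top linking domains
# (assumed: CSV, one domain in the first column, header row) and flag
# the ones that no longer even resolve -- a rough proxy for the parked
# / dead / spam-attack leftovers described above.
import csv
import socket

def domain_resolves(domain: str) -> bool:
    try:
        socket.getaddrinfo(domain, 443)
        return True
    except socket.gaierror:
        return False

dead = []
with open("top_linking_sites.csv", newline="") as f:   # assumed export filename
    reader = csv.reader(f)
    next(reader, None)                                 # skip header row, if present
    for row in reader:
        domain = row[0].strip()
        if domain and not domain_resolves(domain):
            dead.append(domain)

print(f"{len(dead)} of the top linking domains no longer resolve")
```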
Services like Semrush's backlink profile tool have a coronary every time they look at our backlink profile. In the "highly toxic" category, it shows 6000 toxic domains with tens of thousands of backlinks. And I know that this is just a tiny portion of the total out there.
Now, I want to say that I'm sure the good folks at Google do what they can to devalue spam links. I get it, it's a really hard problem, and I have read every word everyone at Google has said on the issue over the years. They're confident, and that's great. But I really think they have a hole in their backlink algorithms when it comes to older sites like ours. The crap:good noise level is just so high, one has to wonder if they can see the quality forest through 20 years of backlink-polluted trees.
Perhaps it's something else... maybe there's a bug in 301 handling, so that old sites with large numbers of 301s are getting weirdly PageRank-diluted, or some statistics table isn't getting updated with the current state of things and instead is aggregating stale data from 15 years ago. Or perhaps there is effectively a time penalty for sites with large numbers of webpages written 15 years ago -- perhaps an algorithm sees 80,000 pages written between 2000 and 2010 and says "site is stale", even if those pages are evergreen (like ours are) and constantly updated for accuracy as time moves on (as ours are).
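To spell out the dilution guess with some arithmetic: this is pure illustration of the hypothesis, not documented Google behavior (Google has said 3xx redirects don't bleed PageRank), but if each redirect hop were damped like an ordinary link at the classic ~0.85 factor, long legacy redirect chains would compound into a real loss.

```python
# Pure illustration of the "301 dilution" guess above, NOT documented
# Google behavior: if each redirect hop were damped like an ordinary
# link (assumed per-hop factor of ~0.85, the classic PageRank damping
# value), chains of legacy redirects would compound quickly.
DAMPING = 0.85  # assumed per-hop factor, for illustration only

for hops in range(6):
    retained = DAMPING ** hops
    print(f"{hops} redirect hop(s): {retained:.0%} of the original value retained")
# 0 hops: 100%, 3 hops: ~61%, 5 hops: ~44%
```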
One other interesting data point: I'm not sure if you know this, but if Google sees a directory has a lot of low-quality content, it will hold a grudge against that directory *even if the content improves significantly*. We had a part of our site that contained very thin content, and although it was fine and useful for our actual users, we could see Google was getting mad at us, so we improved that content. Google refused to revisit any content there that it had marked as "Crawled - currently not indexed" or "Discovered - currently not crawled". For *over a year*, even after it had reindexed the vastly improved content, it held a grudge against that /directory/. It hated that directory: an extremely high rate of excluded / crawled-but-not-indexed pages in it, far above anywhere else on the site. And even as it sucked in the new content and saw that the content was now good, it still hated that /directory/.

I even ran an experiment... I took some pages from /directory/ that Google was refusing to even LOOK AT (discovered, not crawled) and moved them to a fresh /directory2/. Google indexed them ALL overnight! Same content, different location. So Google's algorithms hold statistical grudges, and they're not particularly good or fast at updating their statistics for things like "low-quality content lives in /directory/". It's possible that stuff like this can hurt older sites too -- sins of thin or duplicate content from years ago... are incidents and transgressions from 10 years ago that were long ago fixed still haunting us? Is it simply that the older your website is, the more minor grudges Google holds, and over time those minor grudges/algorithmic demotions add up to big issues?
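For anyone who wants to look for the same pattern on their own site, here's a rough sketch of the tally that surfaces it, assuming you have some export of URL plus indexing status; the filename, column names, and status strings here are all assumptions, not the actual GSC export format.

```python
# Hypothetical sketch of spotting a per-directory "grudge": group a
# page indexing export (assumed CSV columns: url, status) by top-level
# directory and compare the not-indexed rate across directories.
import csv
from collections import Counter
from urllib.parse import urlparse

totals, excluded = Counter(), Counter()

with open("page_indexing_export.csv", newline="") as f:   # assumed filename
    for row in csv.DictReader(f):
        path = urlparse(row["url"]).path
        top = "/" + path.strip("/").split("/")[0] + "/" if path.strip("/") else "/"
        totals[top] += 1
        # Treat anything other than a plain "Indexed" status as excluded
        # (assumed status labels).
        if row["status"].strip().lower() != "indexed":
            excluded[top] += 1

for directory, n in totals.most_common():
    print(f"{directory:20s} {excluded[directory] / n:6.1%} not indexed ({n} URLs)")
```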
There are just so many ways that legacy data can interact in unexpected ways with new algorithms. Shrug. Sigh.