Try exact-text searches for a random selection of the “crawled but not indexed” pages and see if they really do not come up in the index.
designergweb, while I agree with many of the comments in this thread about approaching GSC with some skepticism, there is a trend with large sites that hasn't been noted here: Google is no longer simply indexing everything that's been submitted to it. I would therefore pay some attention to the "crawled but not indexed" pages you see reported, as they may actually be telling you something real about your site. That said, GSC's reporting is necessarily the last link in a long chain in the indexing process, and Google's indexes are so large that there's no easy way to change that.
First, do follow lucy24's suggestion: search for exact text strings from these pages (i.e., wrap a distinctive phrase in quotation marks) and see what you find. IMO, exact-text searches are the most dependable way to determine whether the pages you're looking for have been "indexed" in the particular layer of Google's indexes that you care about... the visible pages... those that are shown in the serps and are considered for ranking. If you have more than a handful of URLs to check, see the sketch just below.
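A minimal sketch of how you might semi-automate that check, assuming Python 3 and only the standard library. It pulls one distinctive sentence from each reported page so you can paste it into Google as a quoted query; the URL list and the length thresholds are placeholders of mine, not anything from GSC:

```python
# Sketch only: pull one distinctive sentence from each "crawled but not
# indexed" URL so it can be pasted into Google as an exact quoted query.
# The URL list below is a placeholder; substitute pages from your GSC report.
import re
import urllib.request
from html.parser import HTMLParser

class TextGrabber(HTMLParser):
    """Collects visible page text, skipping script/style blocks."""
    def __init__(self):
        super().__init__()
        self.skip = False
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.skip = False

    def handle_data(self, data):
        if not self.skip:
            self.chunks.append(data)

def distinctive_sentence(url, min_len=60, max_len=150):
    """Return one mid-length sentence from the page body, or None."""
    try:
        html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    except OSError:
        return None  # fetch failed; skip this URL
    grabber = TextGrabber()
    grabber.feed(html)
    text = re.sub(r"\s+", " ", " ".join(grabber.chunks))
    for sentence in re.split(r"(?<=[.!?]) ", text):
        sentence = sentence.strip()
        if min_len <= len(sentence) <= max_len:
            return sentence
    return None

# Placeholder list; replace with your own reported URLs.
for url in ["https://www.example.com/"]:
    snippet = distinctive_sentence(url)
    if snippet:
        print(f'Search Google for: "{snippet}"')
```

I'd still run the actual searches by hand: scraping Google's result pages is against their guidelines and unreliable, so the script stops at producing the quoted queries for you to paste in.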
There has been a history, though, over the past year (actually longer) of Google trying to discourage use of the Fetch tool as a way of getting pages into the serps. It was greatly misused by spammers, and the tool was never intended for mass submissions. For a while it became a kind of mini-industry, where so-called SEOs who knew about the tool used it to get less-than-adequate pages into the index.
See the first of our threads on the topic, spearheaded by some very persistent reporting by Barry Schwartz of seroundtable, whose posts we frequently cite in the thread....
Big reductions in crawl-to-index limits on Google Fetch tool (March 2018)
https://www.webmasterworld.com/google/4893740.htm
From Barry's article, quoting Google's John Mueller...
...The "Request indexing" feature on Fetch as Google is a convenience method for easily requesting indexing for a few URLs; if you have a large number of URLs to submit, it is easier to submit a sitemap. instead. Both methods are about the same in terms of response times....
Note also item #5 on the page...
5 - Recrawling is not immediate or guaranteed. It typically takes several days for a successful request to be granted. Also, understand that we can't guarantee that Google will index all your changes, as Google relies on a complex algorithm to update indexed materials.
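Coming back to Mueller's point about sitemaps: if you do go the sitemap route for bulk submission, the format is the plain sitemaps.org protocol. A minimal sketch of generating one, again in Python; the page list is hypothetical, and in practice you'd pull it from your CMS or crawl data:

```python
# Minimal sketch: write a sitemaps.org-format sitemap.xml for bulk submission.
# The URLs below are placeholders; substitute your own pages.
from datetime import date
from xml.sax.saxutils import escape

pages = [
    "https://www.example.com/",
    "https://www.example.com/products/widget",
]

entries = "\n".join(
    f"  <url>\n"
    f"    <loc>{escape(url)}</loc>\n"
    f"    <lastmod>{date.today().isoformat()}</lastmod>\n"
    f"  </url>"
    for url in pages
)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```

You can then submit the file in GSC's Sitemaps report, or point crawlers at it with a Sitemap: line in robots.txt.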
Over time, Google raised the quality bar on what it would accept and imposed severe limits on submission quantities. The indexing problems in the early part of this year came, I think, partly from a glitch in an algorithm involved in this process.... and from time to time I suspect these problems were also used to obscure the nature of the changes being made... but that's personal conjecture.
It's hard to say how much of what you're seeing now, in the way of "Crawled, Currently Not Indexed", goes back to how your site achieved its indexed status in the first place, and how much to these recent changes at Google.
Please post your observations as your indexed status evolves. I should add that I myself prefer to rely on natural crawling, as that gives me a better view of how the site structure and its integration with the web are performing.