Forum Moderators: Robert Charlton & goodroi
Massive jumps in GSC legacy crawl errors - who sees this?
So you think it's more likely that the crawl errors are not associated with how they are testing or rolling out the update...
Google is going to be hard-pressed to keep denying any relationship between legacy crawl error increases and updates involving links.
If you won't accept that anything happens until you get that level of proof, you're not going to find any satisfaction on these forums!
...The number jumped by 90,000 a few weeks ago... You are not the only one who's noticed that this amount of data is too much for the average user.
This massive number of unactionable, ancient legacy links from 5 to 15 years ago makes this part of Webmaster Tools useless for us.
5) We list crawl errors in Webmaster Tools by priority, which is based on several factors. If the first page of crawl errors is clearly irrelevant, you probably won't find important crawl errors on further pages.
https://webmasters.googleblog.com/2012/03/crawl-errors-next-generation.html
Less is more
We used to show you at most 100,000 errors of each type. Trying to consume all this information was like drinking from a firehose, and you had no way of knowing which of those errors were important (your homepage is down) or less important (someone's personal site made a typo in a link to your site). There was no realistic way to view all 100,000 errors - no way to sort, search, or mark your progress. In the new version of this feature, we've focused on trying to give you only the most important errors up front. For each category, we'll give you what we think are the 1000 most important and actionable errors. You can sort and filter these top 1000 errors, let us know when you think you've fixed them, and view details about them.
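To make the "top 1000 most important and actionable errors" idea concrete, here is a minimal sketch of how such a priority ranking could work. Everything here is invented for illustration: the field names, the weights, and the scoring function are hypothetical, not Google's actual algorithm, which the blog post only says is "based on several factors."

```python
# Hypothetical sketch of priority-ranking crawl errors, loosely modeled on
# the "top 1000 most important and actionable" idea in the blog post.
# All fields and weights are invented for illustration.
from dataclasses import dataclass


@dataclass
class CrawlError:
    url: str
    error_type: str      # e.g. "404", "server_error"
    internal_links: int  # links from the site's own pages
    external_links: int  # links from other sites
    in_sitemap: bool


def priority(err: CrawlError) -> float:
    """Higher score = more important and actionable (invented weighting)."""
    score = 0.0
    if err.error_type == "server_error":
        score += 100           # site-side failures outrank broken inbound links
    if err.in_sitemap:
        score += 50            # the site itself claims this URL should exist
    score += 5 * err.internal_links  # fixable by editing your own pages
    score += 1 * err.external_links  # a typo on someone else's site matters less
    return score


def top_errors(errors: list[CrawlError], n: int = 1000) -> list[CrawlError]:
    """Return only the n highest-priority errors, sorted most important first."""
    return sorted(errors, key=priority, reverse=True)[:n]
```

Under a scheme like this, a down homepage or checkout path would surface on the first page, while an ancient typo'd inbound link would sink toward the bottom and never be shown at all, which matches the "first page is clearly irrelevant" heuristic quoted above.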
Make of that what you will, but it looks to me like Google might have suffered some kind of data glitch recently, which could account for a lot of errors on a big site.