I'm looking for some more data points, so if someone can confirm or deny this, I'd be much obliged.
Do you guys see more 404 errors in the "Crawl errors" section over the last couple of days? I mean, the errors most likely existed for quite some time but hadn't shown up in WMT before.
I have a number of sites that suddenly started showing a significant number of 404 errors (in the tens of thousands on larger sites), all due to the same pagination error caused by a plugin incorrectly counting the number of comments. Since I haven't made any software changes lately (at least nothing that would involve so many of my sites), I can only assume that these errors have been there for a LONG time (years in some cases) but hadn't been reported by G.
Most of those errors show a "Detected" date within just the last week. Since there hasn't been any tremendous influx of Gbot visits, I would have to assume that in order to pick up so many new 404s, Googlebot would have to visit ONLY erroneous pages. I just can't see how that's possible.
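One way to check when Googlebot actually started hitting these URLs, independently of what WMT reports, is to scan your raw access logs for 404 responses served to Googlebot and bucket them by date. Here's a rough sketch assuming an Apache/Nginx combined log format (the log format, field order, and the sample lines below are assumptions/placeholders; adjust to your server's setup):

```python
import re

# Minimal sketch: match combined-log-format lines, keep only 404s
# served to Googlebot, and count them per date.
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[(?P<date>[^:]+)[^\]]*\] "(?P<method>\S+) (?P<path>\S+)[^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_404s_by_date(lines):
    counts = {}
    for line in lines:
        m = LOG_LINE.match(line)
        if not m:
            continue  # skip lines that don't match the assumed format
        if m.group("status") == "404" and "Googlebot" in m.group("agent"):
            counts[m.group("date")] = counts.get(m.group("date"), 0) + 1
    return counts

# Two fabricated example lines (placeholders, not real traffic):
sample = [
    '66.249.66.1 - - [01/Mar/2012:10:00:00 +0000] "GET /comments/page/99/ HTTP/1.1" '
    '404 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [01/Mar/2012:10:00:05 +0000] "GET /about/ HTTP/1.1" '
    '200 2048 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
]
print(googlebot_404s_by_date(sample))  # -> {'01/Mar/2012': 1}
```

If the logs show Googlebot getting 404s on those pagination URLs months or years back, that would support the theory that the errors are old and only the reporting is new.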
Does anyone else see this? Can anyone offer a theory about why they would suddenly decide to show errors that I suspect they've known about for quite some time but never divulged?
Thanks!