I guess you have to decide how important Google's cache is to your site before you worry about it too much.
"404. That’s an error. The requested URL /search?q=cache: (example site here) was not found on this server. That’s all we know."
Well, the site is on the server, it's not a 404, and "That's all we know" isn't true (they certainly know why it's happening). In the past, when you got this occasionally, it wasn't hard to shrug your shoulders and not worry about it, because inevitably it would go away. I think most of us, over many years, have gone about our business with the understanding that what Google displayed in its search results was based on a cached copy of a site — a reference point that in most cases you could go and look at.
But when Google pukes up a 404 consistently for three weeks, for 50 sites, it doesn't cause me worry so much as curiosity, along with the suspicion that comes from watching them casually brush off the question.
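If you want to track this across your own sites rather than spot-checking by hand, a small script can poll the cache lookup for each domain. This is only a sketch: it assumes the webcache.googleusercontent.com endpoint that the `cache:` operator resolves to, and Google may block or rate-limit automated requests, so treat the results as indicative rather than authoritative. The domain list is a placeholder.

```python
# Sketch: check whether Google still serves a cached copy of each site.
# Assumes the webcache.googleusercontent.com endpoint; automated requests
# may be blocked or rate-limited, so results are indicative only.
from urllib.parse import quote
from urllib.request import Request, urlopen
from urllib.error import HTTPError


def cache_url(domain: str) -> str:
    """Build the Google cache lookup URL for a domain."""
    return "https://webcache.googleusercontent.com/search?q=cache:" + quote(domain)


def cache_status(domain: str, timeout: float = 10.0) -> int:
    """Return the HTTP status for the cache lookup (404 = no cached copy served)."""
    req = Request(cache_url(domain), headers={"User-Agent": "Mozilla/5.0"})
    try:
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as err:
        # urlopen raises on 4xx/5xx; the status code is still on the error.
        return err.code


if __name__ == "__main__":
    # Placeholder list — substitute the sites you actually want to watch.
    for site in ["example.com", "example.org"]:
        print(site, cache_status(site))
```

Running it daily and logging the output would at least turn "it's been three weeks" from an impression into a record.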
There is no cause for concern, because at this juncture there's nothing you can do about it, and there is no evidence it's causing anyone difficulty. But the question remains: why is this happening, why is it so much more widespread than in the past, and why is it increasing so dramatically? How would you feel if it got to the point where being able to see a cached copy of a site was the exception and not the norm?