We've been having some issues, and I'd like to see if anyone has experienced anything similar:
Firstly, we inadvertently started serving pages as NoIndex,NoFollow. This led to many pages dropping out of the index. We fixed this issue and have been waiting for Google to reindex the pages.
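For anyone in a similar spot, here's a minimal sketch of how I'd verify the fix actually shipped: a small helper (my own, not any official tool) that flags a page as noindex if either the `X-Robots-Tag` response header or a robots meta tag in the HTML contains the directive. The sample responses at the bottom are canned stand-ins; you'd feed in real headers and HTML from your own URLs.

```python
import re

def has_noindex(headers, html):
    """Return True if the response headers or HTML mark the page noindex."""
    # Check the X-Robots-Tag HTTP header, e.g. "noindex, nofollow".
    robots_header = headers.get("X-Robots-Tag", "").lower()
    if "noindex" in robots_header:
        return True
    # Check for <meta name="robots" content="..."> in the HTML.
    # (Simple regex sketch; assumes name appears before content in the tag.)
    for match in re.finditer(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
        html, re.IGNORECASE,
    ):
        if "noindex" in match.group(1).lower():
            return True
    return False

# Canned examples (substitute real responses fetched from your site):
print(has_noindex({}, '<meta name="robots" content="noindex,nofollow">'))  # True
print(has_noindex({"X-Robots-Tag": "all"}, "<html></html>"))               # False
```

Running this against a sample of the affected URLs is a quick sanity check that no stray noindex directives remain while you wait on reindexing.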
Secondly, we've seen many "Unreachable" errors for URLs submitted in our XML sitemaps via WMT. We've also seen many errors related to crawl rate (e.g., "Crawl rate problem: We were not able to download your Sitemap file due to the crawl rate we are using for your server."). I went ahead and increased the crawl rate to the maximum, but I think we are still in the doghouse with regard to crawl budget.
Thirdly, we've seen activity from Googlebot (via Google WMT) plummet to near zero, where we've previously seen levels as high as 500k requests a day.
And as a side note, when using the "Fetch as Googlebot" feature, we can't even fetch our homepage (which is still in the index).
Could this be a weird error on the part of Google Webmaster Tools? Or could our crawl budget have been severely cut due to the previous problems (noindex'd pages, outages)? That is, could the "Unreachable" error via the "Fetch as Googlebot" tool be an indication that we don't have crawl budget allocated for our site?
I have a suspicion that the rampant resubmission of XML sitemaps following the noindex debacle could have raised a red flag affecting our crawl budget, but I don't know anything for sure.
Any help would be greatly appreciated, as always.