I used to let Google determine my crawl rate by default, using the "Recommended" option. Then several months ago I noticed that Google's crawl rate had dropped drastically, so I switched to a custom crawl rate and increased the slider by 250%. It worked fine after that.
However, a few days ago Google sent me a warning message saying they were unable to crawl my site due to a 'Low Crawl Rate' setting. This is also affecting the number of indexed pages shown in WMT.
So I checked my custom crawl rate slider again, but it was already set almost to the maximum, so I can't go much higher. I can't figure out how this speed could be considered a "low crawl rate" according to Google's warning message.
Has anyone experienced the same situation? I'd appreciate any comments.
When I had a crawl rate issue about a year ago, it was due to sluggish response times on my server. I improved the underlying software and that solved the problem for me. You might check the Crawl Stats section in WMT and look for spikes in the downloading times graph. Google will override your WMT setting and throttle back your crawl rate if it sees a sustained period of slow responses.
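If it helps, here's a rough way to eyeball your own response-time logs for that kind of sustained slowdown. This is just a sketch: the window size and the 1-second threshold are assumptions I picked for illustration, not Google's actual criteria.

```python
from statistics import mean

def flag_slow_periods(timings_ms, window=5, threshold_ms=1000):
    """Return starting indices where the rolling average of response
    times exceeds the threshold -- a rough stand-in for the kind of
    'sustained period of slow responses' that can trigger throttling.
    window and threshold_ms are illustrative values, not Google's."""
    flagged = []
    for i in range(len(timings_ms) - window + 1):
        if mean(timings_ms[i:i + window]) > threshold_ms:
            flagged.append(i)
    return flagged

# Example: a burst of slow responses in the middle of the log
timings = [200, 250, 300, 1800, 2200, 2100, 1900, 2400, 300, 250]
print(flag_slow_periods(timings))  # -> [1, 2, 3, 4, 5]
```

A single slow page won't trip a check like this, but a run of them will, which matches what I saw in the downloading times graph when my server was struggling.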