Forum Moderators: Robert Charlton & goodroi
Google collects documents on the web using Googlebot, our web crawler. You should leave this control at the Normal setting unless you are having trouble with the speed at which Googlebot is crawling your server.
We've detected that Googlebot is limiting the rate at which it crawls pages on your site to ensure it doesn't use too much of your server's resources. If your server can handle additional Googlebot traffic, we recommend that you choose Faster below.
Google makes these statements in Webmaster Tools.
We are currently on "faster" crawl and this will expire in 11 days.
But if we need to extend the "faster" crawl beyond the 3-month expiry, how is this done? I can't see anything that indicates this will be possible, and experimentally changing one of our sites to "normal" greyed out the "faster" option - so resetting and starting again doesn't seem possible.
The rate at which Googlebot crawls is based on many factors. At this time, crawl rate is not a factor in your site's crawl. If it becomes a factor, the Faster option below will become available.
As I thought, after 3 months our site has reverted to the normal crawl rate - but it appears to me that we need the faster crawl, since it previously averaged only a small number of Googlebot visits, which is inadequate for a complete crawl.
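One way to check whether Googlebot really is visiting too rarely is to count its requests in your server's access log. This is a minimal sketch, assuming an Apache/Nginx combined log format; the log file path and the exact regex are assumptions you would adapt to your own server setup.

```python
import re
from collections import Counter

# Assumed path - point this at your own access log.
LOG_FILE = "access.log"

# Combined log format ends with the quoted User-Agent string; we keep
# only lines whose User-Agent mentions Googlebot, and capture the date
# portion of the [dd/Mon/yyyy:hh:mm:ss zone] timestamp.
LINE_RE = re.compile(r'\[(\d{2}/\w{3}/\d{4}):[^\]]+\].*"[^"]*Googlebot[^"]*"$')

def googlebot_hits_per_day(path):
    """Count requests whose User-Agent mentions Googlebot, grouped by date."""
    hits = Counter()
    with open(path) as f:
        for line in f:
            m = LINE_RE.search(line)
            if m:
                hits[m.group(1)] += 1
    return hits

if __name__ == "__main__":
    for day, count in sorted(googlebot_hits_per_day(LOG_FILE).items()):
        print(day, count)
```

Comparing the daily totals against the number of pages on the site gives a rough idea of whether a full crawl is even possible at the current rate.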
I wish I knew how, or if, this can be reactivated.
I've got pages that have been changed 5 times since their last cache date, so Google needs to get the lead out! Its cache is OLD and OUTDATED.
Unfortunately, I don't know how to reinstate sites that only have the "normal" and "slower" options back to "faster".
Aaron ...... I had a fossick around on your blog, but couldn't locate your article. Do you have the gist of it available? Maybe I'm putting too much importance on it.