I have many sites in my Google Webmaster Tools account. Some of them are crawled very slowly, and I've noticed that some have the "Faster" option available, with the following message shown:
We've detected that Googlebot is limiting the rate at which it crawls pages on your site to ensure it doesn't use too much of your server's resources. If your server can handle additional Googlebot traffic, we recommend that you choose Faster below.
After I set it, Google says the option will revert to "Normal" after 90 days.
I'd like to understand this option better. Why does Google decide to crawl certain sites more slowly?
Thanks
Google's crawlers are adaptive
Agreed - and in more ways than server response time. The crawl team has logic in place that allocates crawling resources according to how valuable crawling any particular URL is. The best way to get frequent and deeper crawls is to have a website that Google sees as very important for its end users. To the degree that you do that, you will get more crawling resources.
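One practical way to see what crawl rate you're actually getting (and whether the "Faster" setting changes anything) is to tally Googlebot requests per day in your server access logs. Here's a minimal sketch in Python, assuming combined log format; the sample lines and the simple user-agent string match are illustrative assumptions, not anything Google documents (a spoofed agent would also match, so for a strict check you'd verify the requesting IP with a reverse DNS lookup):

```python
import re
from collections import Counter

# Pull the date portion (e.g. "10/Oct/2023") out of a combined-format
# log timestamp like [10/Oct/2023:13:55:36 +0000].
LOG_DATE_RE = re.compile(r'\[(?P<day>[^:]+):')

def googlebot_hits_per_day(lines):
    """Count requests per day whose user-agent string mentions Googlebot.

    Note: this trusts the user-agent header; spoofed bots will be
    counted too unless you also verify IPs via reverse DNS.
    """
    counts = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue
        m = LOG_DATE_RE.search(line)
        if m:
            counts[m.group("day")] += 1
    return counts

# Hypothetical sample log lines for illustration:
sample = [
    '66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] "GET /a HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Oct/2023:14:01:02 +0000] "GET /b HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.5 - - [10/Oct/2023:14:02:10 +0000] "GET /c HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(googlebot_hits_per_day(sample))  # → Counter({'10/Oct/2023': 2})
```

Run this against a few weeks of logs and you can see whether the daily Googlebot request count actually rises after switching to "Faster", or drifts back down as Google re-adapts.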