peego - 3:43 am on Jul 6, 2012 (gmt 0)
Just wondering: will lowering googlebot's crawl rate affect rankings in the SERPs?
For example, the default in WMT is 0.1 requests per second, i.e. 10 seconds between requests. If I lowered that to, say, 0.033 requests per second (30.303 seconds between requests), what sort of impact would that have?
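Just to sanity-check the numbers above, the delay between requests is simply the reciprocal of the requests-per-second rate (this is plain arithmetic, not anything WMT-specific):

```python
# Back-of-the-envelope check of the crawl-rate figures in the post.

def seconds_between_requests(requests_per_second):
    """Delay between requests implied by a requests-per-second rate."""
    return 1.0 / requests_per_second

default_rate = 0.1    # WMT default from the post: 0.1 requests/second
lowered_rate = 0.033  # proposed lower rate

print(seconds_between_requests(default_rate))             # 10.0 seconds
print(round(seconds_between_requests(lowered_rate), 3))   # 30.303 seconds
```

So 0.033 req/s is roughly a third of the default crawl rate.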
The reason I'm asking is that googlebot is apparently causing abnormally high bandwidth use on our server. We have caching enabled with Nginx (according to our webhost), and googlebot activity seems to be pushing our outgoing traffic from the normal 400kb/s to over 1400kb/s. When they blocked googlebot's IP for 5 minutes, traffic dropped back down to 400kb/s, then shot back up to 1400kb/s once the IP was allowed again.
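If I assume googlebot's share of the bandwidth scales roughly linearly with the crawl rate (an assumption on my part, not something WMT guarantees), the figures above give a rough idea of what lowering the rate might do:

```python
# Rough estimate only: assumes googlebot bandwidth scales linearly
# with the configured crawl rate. Traffic figures are from the post.

baseline_kbps = 400        # normal outgoing traffic
with_googlebot_kbps = 1400 # traffic while googlebot is crawling
googlebot_share = with_googlebot_kbps - baseline_kbps  # ~1000 kb/s

current_rate = 0.1    # current crawl rate (requests/second)
proposed_rate = 0.033 # proposed lower rate

scaled_share = googlebot_share * (proposed_rate / current_rate)
print(round(baseline_kbps + scaled_share))  # ~730 kb/s total
```

So under that (unverified) linear assumption, total outgoing traffic would drop to roughly 730kb/s rather than all the way back to 400kb/s.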
Can anyone suggest what I should set our googlebot crawl rate to? What sort of impact would it have? And would it affect rankings?