peego - 4:57 am on Jul 6, 2012 (gmt 0)
Actually, I compared our MRTG chart (which shows bandwidth use) to the Googlebot crawl rate in WMT, and there's a big difference between the average of 2000 pg/day and the peaks of 5000 pg/day.
When Googlebot crawls 2000 pg/day, MRTG shows outgoing bandwidth at its normal 400 kb/s; when it crawls 5000 pg/day, outgoing bandwidth roughly triples to around 1300 kb/s. So it's using up a lot of bandwidth.
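A quick back-of-envelope check of those figures (a sketch only: it assumes "kb/s" means kilobits per second and that the extra bandwidth is sustained across the whole day, neither of which the MRTG chart confirms):

```python
SECONDS_PER_DAY = 86_400

def extra_bytes_per_page(extra_kbits_per_s, extra_pages_per_day):
    """Implied average transfer size per extra crawled page."""
    extra_bits = extra_kbits_per_s * 1_000 * SECONDS_PER_DAY
    return extra_bits / 8 / extra_pages_per_day

# 1300 - 400 = 900 kb/s more bandwidth for 5000 - 2000 = 3000 more pages/day
per_page = extra_bytes_per_page(900, 3000)
print(f"{per_page / 1_000_000:.2f} MB per extra page")  # prints "3.24 MB per extra page"
```

If that number looks too big for your average page, the crawl peaks probably aren't sustained all day, or the extra bandwidth isn't all Googlebot.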
Can someone knowledgeable about this please recommend what I should set our crawl rate to? How many pages per day should Googlebot be crawling for a 1,500-page site, and what crawl rate setting should I use to achieve that?