tedster - 6:05 am on Jul 6, 2012 (gmt 0)
Bandwidth is not the right metric to be concerned about - server response time is. If googlebot hits are not slowing down your response to other visitors, then there is no ranking problem. Lowering your crawl rate in such a situation might save you bandwidth, but any decent ISP should be offering enough bandwidth that googlebot is not going to increase your hosting costs.
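If you want to see whether googlebot is actually affecting response times, one rough way is to compare timings straight out of your access log. This is only a sketch, and it assumes a file named access.log where the response time in seconds is the last field on each line - something you have to configure yourself (Nginx's $request_time, for example). Adjust the parsing to whatever your log format really is.

from statistics import mean

googlebot, others = [], []

with open("access.log") as log:              # hypothetical log path
    for line in log:
        fields = line.split()
        if not fields:
            continue
        try:
            response_time = float(fields[-1])  # assumes last field is request time in seconds
        except ValueError:
            continue                           # skip lines with no timing field
        if "Googlebot" in line:                # crude user-agent match
            googlebot.append(response_time)
        else:
            others.append(response_time)

if googlebot and others:
    print("Googlebot avg: %.3fs over %d hits" % (mean(googlebot), len(googlebot)))
    print("Other traffic avg: %.3fs over %d hits" % (mean(others), len(others)))

If the two averages are in the same ballpark, crawling isn't your problem.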
The best way to handle googlebot crawling, in my experience, is to let their crawl routines do their thing. There will be highs and lows - just let it be. Now, if you have a lot of canonical errors on your site so that too many "duplicate content" URLs are being fetched, then you need to give the site some technical attention to fix that problem.
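If you suspect canonical problems, one quick sanity check is to request a handful of URL variants and see whether they redirect to a single URL or all answer 200 with differing (or missing) canonical tags. A minimal sketch - example.com stands in for your own pages, and the crude regex is just enough to spot a rel=canonical link, not a robust HTML parse:

import re
import urllib.request

# Hypothetical variants that often get crawled as "duplicate content"
variants = [
    "http://example.com/page",
    "http://example.com/page/",
    "http://www.example.com/page",
    "http://example.com/page?sessionid=123",
]

for url in variants:
    try:
        with urllib.request.urlopen(url) as resp:        # follows redirects
            html = resp.read().decode("utf-8", errors="replace")
            match = re.search(
                r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)', html)
            print(url, "->", resp.url, "status", resp.status,
                  "canonical:", match.group(1) if match else "none")
    except Exception as exc:
        print(url, "->", exc)

Every variant should either 301 to one URL or point its canonical tag at it. If they all return 200 with no canonical, that's where googlebot is wasting its crawl.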
And in some cases, googlebot itself can have trouble - usually only for a short period. But again, I encourage you to find a workable way not to use those crawl rate controls.
There is no ideal "setting per number of pages". How frequently is your content updated? And similarly, how "fresh" are the spaces you are competing in? That also makes a difference.
If freshness is not a major factor, then frequent crawling may not be so important. However, there are many members here who WISH they could get a higher crawl rate, because their indexed content is too stale - and that can have a negative effect on traffic.
So be very careful about taking your crawl rate off auto-pilot. I've almost never seen that help a site. Think "big picture" here.