rowan194 - 4:10 am on Jun 14, 2011 (gmt 0)
Before the custom crawl rate setting was introduced (letting me ask Googlebot through GWT to calm down), it wasn't unusual for G to fetch more than 120,000 pages a day. They ignore the Crawl-delay robots.txt directive.
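For anyone unfamiliar, the directive in question looks like this in robots.txt (Googlebot ignores it, but some other crawlers such as Bing's honored it at the time; the value of 10 seconds is just an example):

```
# Ask compliant crawlers to wait at least 10 seconds between requests.
# Googlebot does NOT honor this; use the GWT crawl rate setting instead.
User-agent: *
Crawl-delay: 10
```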
It wasn't so much of an issue back then, but now that my database's complexity and size have grown, each page takes more I/O to render. The majority of my server load is dedicated to servicing Googlebot's requests!