My bandwidth on a couple of sites is getting absolutely caned at the moment. Usage is already three times what it was last month! I've had to suspend one virtually unused site before it eats through my entire reseller quota.
When I check the logs, Googlebot/2.1 is at the top of the user-agent list. However, this much bandwidth seems excessive for ordinary crawling.
Actually, it's eating bandwidth even faster than I said. I think I have 36 GB; at the current rate it could get through a fair proportion of that. The culprits are: 184.108.40.206, 220.127.116.11 and 18.104.22.168
Are you sure Googlebot isn't caught in a spider trap? Common culprits include session IDs in URLs, or scripts such as calendars that generate an effectively infinite number of pages (e.g. there's always a link to the next and previous month, forever).
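To make the calendar case concrete, a typical script emits markup along these lines (the filename and parameter are illustrative, not taken from your site):

```html
<!-- Hypothetical calendar page: every month links to two more months,
     so a crawler following links never runs out of new URLs to fetch. -->
<a href="calendar.php?month=2004-05">&laquo; Previous month</a>
<a href="calendar.php?month=2004-07">Next month &raquo;</a>
```

Every generated page looks like a new URL to the crawler, so it can keep requesting month pages indefinitely; session IDs have the same effect because each visit mints fresh-looking links to pages the bot has already seen.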
If it's becoming a problem, report it to firstname.lastname@example.org, and if necessary add robots meta tags to exclude the bot from the areas where it has become trapped.
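A minimal sketch of that meta tag, assuming the trap is something like a calendar script (where exactly it goes depends on which script is generating the pages):

```html
<!-- In the <head> of every page the trapped script generates -->
<meta name="robots" content="noindex, nofollow">
```

Alternatively, a `Disallow:` line in robots.txt (e.g. `Disallow: /calendar/` under `User-agent: *`, path hypothetical) stops compliant crawlers from requesting those URLs at all. That's the better option for bandwidth, since with the meta tag the bot still has to fetch each page before it sees the instruction not to follow its links.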