rowan194 - 7:26 am on Aug 15, 2012 (gmt 0)
I did some quick calcs and realised this is most likely going to be an ongoing problem. To fetch 200 million pages from my site at 1 page per second would take over 6 years.
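For anyone checking the maths (assuming a steady 1 fetch per second and ignoring re-crawls):

    200,000,000 pages / 1 page per sec = 200,000,000 sec
    200,000,000 sec / 86,400 sec per day = ~2,315 days = ~6.3 years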
As an interim measure I've created a Google Webmaster Tools (GWT) account for 18.104.22.168 to try to slow Googlebot down; unfortunately they won't let me change the crawl rate yet:
"We do not have enough information about your site at this time to allow changing the crawl rate. Please visit again later."
This is probably an edge case, as most sites don't have 200m+ pages. Still, it's very frustrating that I need to create 3 different GWT accounts for the same site, and that I have to manually re-set the crawl rate every 90 days when the setting expires. I wish Googlebot would recognise and respect the Crawl-delay directive in robots.txt... other popular crawlers do.
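For reference, the directive in question is a one-liner in robots.txt; Bingbot and Yandex honour it, but Googlebot ignores it. The 10 below is just an example value, usually interpreted as roughly one fetch per 10 seconds:

    User-agent: *
    Crawl-delay: 10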