I'd like to prevent massive spikes in Googlebot crawling activity, and I'm not satisfied with using Google Webmaster Tools to achieve this, since Googlebot activity increased dramatically shortly before many of my transactional pages were Panda smacked.
How would I best set up a limit that, once exceeded over a given period, would deny access without causing (too many) ranking problems? Has anyone done this successfully? If so, how?
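To illustrate the kind of thing I have in mind (not something I've deployed), here's a rough sketch in Python of a sliding-window throttle that serves 503 to Googlebot once it exceeds a per-window request cap. All the names here are made up for the example; my understanding is that a temporary 503 with Retry-After is gentler on rankings than an outright 403, but I'd welcome correction on that:

```python
import time
from collections import deque

class CrawlerThrottle:
    """Deny further requests once more than max_requests arrive
    within a sliding window of window_seconds."""

    def __init__(self, max_requests=10, window_seconds=60.0):
        self.max_requests = max_requests
        self.window = window_seconds
        self.hits = deque()  # timestamps of allowed requests

    def allow(self, now=None):
        now = time.monotonic() if now is None else now
        # Drop timestamps that have fallen out of the window.
        while self.hits and now - self.hits[0] > self.window:
            self.hits.popleft()
        if len(self.hits) >= self.max_requests:
            return False  # over the cap: caller should send 503
        self.hits.append(now)
        return True

def handle_request(throttle, user_agent, now=None):
    """Return an HTTP status: throttle only Googlebot, pass everyone else."""
    if "Googlebot" in user_agent and not throttle.allow(now):
        return 503  # temporary error + Retry-After, rather than a hard 403
    return 200
```

So for example, with `max_requests=2` per 60 seconds, the first two Googlebot hits get 200, the third gets 503, and once the window slides past the earlier hits it is allowed again, while ordinary browsers are never throttled. Is this the right general shape, or is there a safer mechanism?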