Google could solve that problem in five minutes by looking for a "Crawl-delay: 0" directive for Googlebot in robots.txt.
That would be treated as permission for crawling at high speed.
If they did that, and publicized that they were doing it, webmasters could decide whether or not to add the directive.
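A hypothetical robots.txt under that scheme might look like this (note that Crawl-delay is a nonstandard directive, and Googlebot does not currently honor it; this is a sketch of the proposal, not current behavior):

```
# Opt in to high-speed crawling for Googlebot
User-agent: Googlebot
Crawl-delay: 0

# All other crawlers: wait at least 10 seconds between requests
User-agent: *
Crawl-delay: 10
```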
It's not a matter of webmasters complaining. It's about secondary web activities (like index builders) honoring the wishes of the primary movers (those providing the content).