Googlebot is a phenomenal crawler, the best at what it does, and I can't see it being beaten anytime soon. However, I can't help wondering whether, in order to create a fair market, Google should be forced to become a raw data provider, with a market of search engines using that data to produce their own results. Those engines could choose different data providers to enhance their results, and Google Search would become just one interpretation of Googlebot's raw data alongside competitors that interpret the data another way. That could create a fair and competitive market in search.
Crawling is actually one of the more straightforward elements of running a search engine. It's primarily data retrieval. There's a whole host of good open source options if you want to run your own crawler, from Nutch to Heritrix. You still need the bandwidth and storage space, of course, which may be what you're hinting at.
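To illustrate why the crawling side is the straightforward part, here's a minimal sketch in Python of the core loop's bookkeeping: parse a fetched page, resolve links, and queue anything unseen. (The names `LinkExtractor` and `crawl_step` are my own, and a real crawler would add fetching, robots.txt handling, politeness delays, and deduplication at scale.)

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags in an HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl_step(base_url, html, seen, frontier):
    """One crawl iteration: parse a fetched page and queue new links."""
    parser = LinkExtractor()
    parser.feed(html)
    for href in parser.links:
        url = urljoin(base_url, href)  # resolve relative links
        if url not in seen:
            seen.add(url)
            frontier.append(url)

# Demo with a canned page instead of a live network fetch.
seen, frontier = set(), deque()
crawl_step("http://example.com/",
           '<a href="/a">A</a> <a href="http://other.org/b">B</a>',
           seen, frontier)
print(sorted(seen))
```

The hard part, as noted below, is everything downstream of this loop.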
But the crawler side (in terms of the software, at least) is not where Google's commercial advantage lies. The key is in processing the data that the crawler uncovers. I think you'd be hard-pressed to argue that Bing's problems stem from a lack of data.
And as lucy24 says, even if there were a good argument in favour (which I don't believe there is), how would you force Google to do such a thing?