The sitemaps for the site here run to about 3.5GB. Bing's idiocy in not understanding the basic sitemap protocol means it tries to re-download files every day that haven't been updated; Google seems a lot better and pays far more attention to the protocol. When it comes to some areas of search engine work, Bing seems to be run more by dilettantes than professionals. Bing also seems to use a number of IPs to fetch the sitemaps, and those IPs appear to operate largely independently of each other. From what I remember, the robots.txt requests from Bing tend to arrive in bursts, so it could well be several of these IPs each requesting robots.txt in quick succession.
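If anyone wants to check whether those bursts really are several distinct Bing IPs asking for robots.txt one after another, a quick pass over the access log will show it. The snippet below is only a rough sketch, not anything official: it assumes an Apache/Nginx combined-format log at access.log and matches the user-agent on "bingbot", so the path and the regex will need adjusting for other setups.

```python
# Rough sketch: group robots.txt hits from Bing by source IP to see whether
# the daily "bursts" are really several crawler IPs arriving in succession.
# Assumes a combined-format access log (path below is an assumption).
import re
from collections import defaultdict

LOG_PATH = "access.log"  # adjust for your server

# IP ... [timestamp] "METHOD path ..." status size "referer" "user-agent"
line_re = re.compile(
    r'^(\S+) \S+ \S+ \[([^\]]+)\] "(?:GET|HEAD) (\S+)[^"]*" \d+ \S+ "[^"]*" "([^"]*)"'
)

hits = defaultdict(list)  # ip -> list of timestamps

with open(LOG_PATH, encoding="utf-8", errors="replace") as f:
    for line in f:
        m = line_re.match(line)
        if not m:
            continue
        ip, ts, path, ua = m.groups()
        if path == "/robots.txt" and "bingbot" in ua.lower():
            hits[ip].append(ts)

# One line per IP: request count plus first and last timestamp seen
for ip, stamps in sorted(hits.items(), key=lambda kv: len(kv[1]), reverse=True):
    print(f"{ip}  {len(stamps)} requests, first {stamps[0]}, last {stamps[-1]}")
```

If the output shows several different IPs with their requests clustered within a few seconds of each other, that would back up the idea that the fetchers are acting independently rather than sharing one cached copy of robots.txt.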