126 requests for robots.txt in just under four minutes.
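For anyone wanting to verify a burst like this in their own logs, here is a rough sketch of one way to tally it, assuming an Apache/nginx combined-format access log; the log path and the "bingbot" user-agent match are assumptions, not anything from the posts here.

```python
# Tally robots.txt requests per client IP from a combined-format log.
# LOG_PATH and the "bingbot" substring are hypothetical; adjust to taste.
from collections import Counter

LOG_PATH = "access.log"  # hypothetical path

hits = Counter()
with open(LOG_PATH) as log:
    for line in log:
        # Combined format puts the client IP first and the request
        # line ("GET /robots.txt HTTP/1.1") in quotes.
        if '"GET /robots.txt' in line and "bingbot" in line.lower():
            ip = line.split(" ", 1)[0]
            hits[ip] += 1

# Show which IPs were responsible for the burst.
for ip, count in hits.most_common(10):
    print(f"{ip}\t{count}")
```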
8:28 pm on Jul 26, 2014 (gmt 0)
Good job you don't run a large website with sitemaps, or the muppets at Bing would be trying to download the entire set of sitemap files every day.
8:37 pm on Jul 26, 2014 (gmt 0)
jmc, I do have a custom sitemap; however, Google is the only one that will use it absent a fee.
Bing does crawl the entire site fairly often, but it has failed to pick up page additions in one specific section even though they have been in place for more than a year.
8:45 pm on Jul 26, 2014 (gmt 0)
The sitemaps for the site here total about 3.5GB. Bing's failure to understand the basic protocol means they try to download the non-updated files each day, even though the sitemap index carries a lastmod date for each file that tells a crawler which ones have actually changed. Google seems to be a lot better and pays more attention to the protocol. When it comes to some areas of search engine work, Bing seems to be run more by dilettantes than professionals. They also seem to use a number of IPs for hitting the sitemaps, and those IPs appear to operate largely independently. From what I remember, the robots.txt requests from Bing tend to arrive in bursts, so it could be several of these IPs requesting robots.txt in succession.
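To illustrate the protocol point: a sitemap index lists a lastmod date for each sitemap file, so a well-behaved crawler can skip files that have not changed since its last visit. A minimal sketch of that logic, assuming a hypothetical index URL and an in-memory store of last-crawl timestamps:

```python
# Lastmod-aware sitemap index fetching: the behavior the protocol
# allows for, and the one Bing apparently skips. INDEX_URL and the
# last_seen store are assumptions for illustration only.
import urllib.request
import xml.etree.ElementTree as ET
from datetime import datetime

INDEX_URL = "https://www.example.com/sitemap_index.xml"  # hypothetical
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Sitemap URL -> datetime of previous crawl (a real bot would persist this).
last_seen = {}

with urllib.request.urlopen(INDEX_URL) as resp:
    root = ET.fromstring(resp.read())

for entry in root.findall("sm:sitemap", NS):
    loc = entry.findtext("sm:loc", namespaces=NS)
    lastmod = entry.findtext("sm:lastmod", namespaces=NS)
    if loc and lastmod:
        modified = datetime.fromisoformat(lastmod.replace("Z", "+00:00"))
        if loc in last_seen and modified <= last_seen[loc]:
            continue  # unchanged since the last crawl: skip the download
    print("would fetch:", loc)
```

A real crawler would persist last_seen between runs; the point is simply that lastmod makes daily re-downloads of unchanged multi-gigabyte sitemap sets unnecessary.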
10:43 pm on Jul 26, 2014 (gmt 0)
On one 250-page static HTML site I manage (which includes a sitemap.xml), bingbot and msnbot each request every single page 4x per day.