The incidents above involve 80legs.com spidering at less than one page per second. My server could easily have handled that rate; the spidering of my site was an order of magnitude heavier. When it's taking down your server, being hit by a "respectable" company's distributed spider doesn't feel much different from a DDoS attack.
I appreciate that 80legs.com's customer service acknowledges that they are at times responsible for overwhelming the servers their customers hire them to target, and that they make some effort to respect robots.txt. But since they can manually slow the rate at which their botnet hits a website once they receive a complaint, there is no apparent reason why they haven't set reasonable default rate limits for every site they spider, which would keep their botnet from ever running amok in the first place.
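To illustrate what "reasonable default limits" could look like, here is a minimal sketch in Python of a per-host throttle that honors robots.txt and falls back to a conservative default delay when a site specifies none. This is purely illustrative; the delay value and the helper names are my assumptions, not anything 80legs has published about their crawler.

```python
# A sketch of default per-host throttling for a crawler: obey robots.txt,
# and fall back to a conservative delay when no Crawl-delay is given.
# DEFAULT_DELAY and the function names are illustrative assumptions.

import time
import urllib.robotparser
from urllib.parse import urlparse

DEFAULT_DELAY = 5.0   # assumed default: at most one request per 5 seconds per host
USER_AGENT = "008"    # 80legs' crawler reportedly identifies itself as "008"

_last_fetch = {}      # host -> timestamp of the most recent request to that host

def polite_delay(url):
    """Return the seconds to wait before fetching `url`, or None if disallowed."""
    host = urlparse(url).netloc
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(f"http://{host}/robots.txt")
    try:
        rp.read()
    except OSError:
        return DEFAULT_DELAY              # robots.txt unreachable: stay conservative
    if not rp.can_fetch(USER_AGENT, url):
        return None                       # disallowed by robots.txt: skip entirely
    delay = rp.crawl_delay(USER_AGENT)
    return float(delay) if delay else DEFAULT_DELAY

def fetch_politely(url, fetch):
    """Fetch `url` via `fetch(url)`, never faster than the per-host delay allows."""
    delay = polite_delay(url)
    if delay is None:
        return None
    host = urlparse(url).netloc
    wait = _last_fetch.get(host, 0) + delay - time.time()
    if wait > 0:
        time.sleep(wait)
    _last_fetch[host] = time.time()
    return fetch(url)
```

In a distributed crawler the per-host timestamps would have to live in shared state rather than a local dictionary, but the point stands: the throttle can default to polite instead of waiting for a complaint.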