Hi shiondev, and welcome to the spider forum, but I think there are still some oversights in your explanations.
I took those stats directly off your website, so you might want to update the information if you don't want to give people the wrong idea.
Even if you can access up to 200K machines per day, that's still an average of 10K pages per machine to reach your stated goal of 2B pages per day, and any home computer crawling 10K pages per day runs a good chance of being blocked by many of the webmasters here.
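Do the math yourself; even the back-of-the-envelope numbers are ugly (quick sketch, using the figures quoted off your own site):

```python
# Back-of-the-envelope check of the 80legs numbers quoted above.
pages_per_day = 2_000_000_000   # stated goal: 2B pages per day
machines = 200_000              # stated pool: up to 200K machines per day

pages_per_machine = pages_per_day // machines
print(pages_per_machine)        # 10000 pages per machine per day

# Spread evenly over 24 hours, that's one request roughly every 8.6 seconds
seconds_per_page = 24 * 3600 / pages_per_machine
print(round(seconds_per_page, 1))  # 8.6
```

A request every 8.6 seconds from one residential IP, around the clock, is exactly the kind of pattern that trips rate limiters and gets home users banned.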
You might want to rethink the liability of causing someone's computer to be banned by various services for abuse.
How do you know if my bandwidth cap is 2GB/mo, 5GB/mo or 250GB/mo?
You can't possibly know which plan I purchased just because of the network I'm using.
If you start running up someone's bandwidth charges, I would expect trouble to follow quickly.
Here's a quick FOR INSTANCE of where you'll get burned unless you test the network in use each and every time: a home computer on Comcast (250GB cap) until a network outage switches it over to Sprint Broadband (5GB cap), or likewise a laptop that anchors at home on Comcast, travels on Sprint, and uses various pay-to-play wifi spots.
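The bare minimum would be to re-check which network the machine is on before every crawl session and refuse to crawl on anything the user hasn't explicitly approved. A rough sketch of that guard; the network names and cap table here are my own made-up examples, not anything 80legs actually does:

```python
# Sketch: only crawl on networks whose bandwidth cap the user has
# explicitly confirmed. Network names and caps are hypothetical.

def should_crawl(current_network: str, confirmed_caps_gb: dict) -> bool:
    """Crawl only if the user confirmed this network's cap is acceptable."""
    return current_network in confirmed_caps_gb

# The user only ever approved their home Comcast line (250GB cap):
caps_gb = {"comcast-home": 250}

print(should_crawl("comcast-home", caps_gb))       # True
print(should_crawl("sprint-broadband", caps_gb))   # False: outage failover, stop crawling
```

Without a check like this, a silent failover to the 5GB plan means your crawler burns the entire month's allowance before anyone notices.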
Here's a link where other webmasters can get the FAQ specifics of 80legs: [80legs.pbworks.com]
So let me get this straight: no matter which customer requests the crawl, you only show "008", so we don't know who's asking you to crawl on their behalf?
Bad idea, that will get you blocked on principle alone by many webmasters.
BTW, what happens if someone installs Digsby on their office machine and then 80legs crawls all sorts of NSFW sites, or so many sites that the company thinks the employee is doing nothing but surfing the web and fires them for goofing off?
Also, could you imagine someone caught in a police sting operation trying to explain why their computer was attempting to access illegal sites (nude kids?) that your software inadvertently crawled?
Another consideration is whether you're the only company using the Digsby network. If someone else crawls over the same network you're using and isn't nearly as nice about it, they could easily hobble your business unless you're the exclusive crawling agent on the Digsby network.
Last but not least, after reading your wiki it sounds like you crawl a domain per customer request and don't share the cached pages among multiple customers, correct?
If you had 20 customers all requesting the same site to be crawled at the same time, would 80legs actually crawl the site 20 individual times?
If so, there will be some major screaming from webmasters.