Thanks for all your help everyone.
A follow-up question. I'm trying to limit crawling because of the sheer volume of pages. The host I'm using (a local company) has me on a plan capped at 100 GB/month of bandwidth, and I have almost 15 GB of data online. So if all three major search engines each crawl the full site twice a month, that's roughly 6 × 15 GB = 90 GB, and I'm almost out of bandwidth before any real visitors show up.
As a result, I set Crawl-delay to 5 in robots.txt. Now I'm wondering whether that's going to prevent the site from really being indexed.
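For reference, this is the kind of robots.txt entry I mean (a minimal sketch; the directive asks compliant crawlers to wait about 5 seconds between requests, though support varies by engine, and Googlebot in particular is known to ignore Crawl-delay):

```
# robots.txt — request a 5-second pause between fetches
User-agent: *
Crawl-delay: 5
```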
On a new site with many tens of thousands of pages of new content, should I keep that crawl delay? Or should I remove it and let the crawlers go nuts?