I think you missed my second point.
The script as presented fixes issue #1, the bandwidth hogs.
I want to stop competitors from crawling my pages after a few hundred pages or so, instead of letting them get access to 40k+ pages. A human has a limit to how much information they can process in a day, but offline downloading and scraping, even at a slow rate, just keeps going and going until they have my entire site.
I'm looking into a modification that will track that behavior and just shut them down after a few hundred pages, with a nice friendly message like "You have exceeded your page view allotment for today, come back tomorrow you greedy pig."
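The idea could be sketched roughly like this (Python here just for illustration, not your actual script; the 300-page limit, the in-memory counter, and the function names are all placeholders, and a real site would persist the counts in a database or cache and reset them daily):

```python
from collections import defaultdict
from datetime import date

DAILY_LIMIT = 300  # "a few hundred pages" -- tune to taste

# (ip, day) -> pages served so far; in-memory only for this sketch
page_counts = defaultdict(int)

def allow_request(ip, day=None):
    """Count this page view and decide whether to serve it."""
    day = day or date.today()
    page_counts[(ip, day)] += 1
    return page_counts[(ip, day)] <= DAILY_LIMIT

def handle(ip):
    if allow_request(ip):
        return "<normal page content>"
    return ("You have exceeded your page view allotment for today, "
            "come back tomorrow you greedy pig")
```

Keying on the date means the counter naturally starts fresh tomorrow, so legitimate heavy readers get back in the next day.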