There was some discussion of this last month.
There has always been a lot of variation in how many pages freshbot takes from a site - some think it is related to PR or other factors.
However, several people noted much deeper crawling by freshbot IPs in the 64.x range last month than they had previously experienced. In my case the typical 30-50 pages became well over 100, but I still had a full deep crawl by 216.x at the normal time.
The past few days I have also had a ton of freshbot hits.
It may be that Google is doing a better job of - or experimenting with - coordinating the 64.x and 216.x crawl results in their database. In the past, freshbot results seemed to be totally wiped out by the update, which never seemed very efficient. I mean, I understand the point of "daily fresh listings", but why delete and re-crawl pages that have not changed since freshbot initially found them?
I haven't analysed my SERPs for certain pages in detail, or compared the freshbot and deepbot log entries to see if 216.x skips some pages that were recently crawled by 64.x, but I have a suspicion that Google is starting to use the two bots in concert to update the database. (It would be interesting to see if 304s appear in the 216.x log entries for pages recently crawled by 64.x.)
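If I get around to it, something along these lines could pull that comparison straight out of the raw logs. This is just a minimal sketch, assuming an Apache combined-format access log and that the only 64.x / 216.x hits in it come from Googlebot - the access.log path and the bare 64. / 216. prefixes are placeholders you would want to tighten up (or check the user agent string as well):

    # Sketch: compare freshbot (64.x) and deepbot (216.x) hits in an
    # Apache combined-format log, and flag 304s from 216.x on pages
    # that a 64.x IP already fetched. Path and prefixes are assumptions.
    import re
    from collections import defaultdict

    LINE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "(?:GET|HEAD) (\S+) [^"]*" (\d{3})')

    fresh_pages = set()            # paths fetched by 64.x
    deep_hits = defaultdict(list)  # path -> list of (timestamp, status) from 216.x

    with open("access.log") as log:    # adjust to your own log file
        for raw in log:
            m = LINE.match(raw)
            if not m:
                continue
            ip, ts, path, status = m.groups()
            if ip.startswith("64."):
                fresh_pages.add(path)
            elif ip.startswith("216."):
                deep_hits[path].append((ts, status))

    # Pages freshbot grabbed that deepbot later answered with a 304 (not modified)
    for path in sorted(fresh_pages):
        for ts, status in deep_hits.get(path, []):
            if status == "304":
                print(path, "deepbot 304 at", ts)

If that printed a pile of 304s on the pages 64.x had just taken, it would be a pretty strong hint the two crawls are being coordinated rather than simply duplicated.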
This is based on pages added after the last deep crawl seemingly staying lodged in the database, whereas before, freshbot results seemed to disappear from the index within hours or days at most.
Also, I echo mfishy and skipfactor - AXS is great for instant log checking. I don't use it on all pages, just on enough of them to let me know what's going on at any moment.