After a recent discussion in the Google forum on how to better entice the Googlebot, I realized that my dynamic URLs were too long for Googlebot to handle. So, I changed them. The very next day, the Fast bot stopped by and asked for all the old links I had just changed!
I run a database-driven site with tens of thousands of individual records and a churn of about 10% per month (i.e. a couple of thousand records drop out, and about the same number are created).
The site has dynamically created pages containing links to all those individual pages. Instead of using the query string, these links take the form www.mydomain.com/BlahXXXXX.htm, where XXXXX is a unique numeric record identifier. Requests for those URLs are caught by the server's custom error page, which redirects to the appropriate record, so the spider only ever sees static-looking addresses.
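Roughly how that might work, sketched in Python (my site isn't actually built this way; the "Blah" prefix matches the pattern above, but the RECORDS store, handler names, and port are assumptions for the example):

```python
import re
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical stand-in for the real database of records.
RECORDS = {12345: "<html><body>Record 12345 content</body></html>"}

# Matches static-looking URLs such as /Blah12345.htm and captures
# the numeric unique record identifier.
PATTERN = re.compile(r"^/Blah(\d+)\.htm$")

class RecordHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        match = PATTERN.match(self.path)
        record = RECORDS.get(int(match.group(1))) if match else None
        if record is None:
            # No such record: a genuine 404.
            self.send_error(404, "Record not found")
            return
        # Serve the record's content directly, so the spider only
        # ever sees the static-looking URL, never a query string.
        body = record.encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), RecordHandler).serve_forever()
```

In the real setup the web server's error page does this lookup; the sketch collapses that into a single handler for clarity.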
I have just had all these pages successfully indexed by Fast (they do all have genuine individual content!).
A handful of these pages have recently shown up in Google as temporary, dated items (freshbot; the pages have now been dropped again). Now I'm waiting for the Google update to see if it works there too.
In a week or so I will be in a position to make a comparison!