That's been my experience as well. There simply are more (what can we call them?) "safety routines" that the crawling software must run to keep a dynamic crawl out of trouble -- and this can have a very noticeable effect.
Well, yes, you can. But even better, you can write a database-driven application that generates REALLY, TRULY static pages and gives each one the file name you choose. Do you really need a database lookup for every single page request on an e-commerce site? Maybe in some specialized cases, but not most of the time.
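To make the idea concrete, here's a minimal sketch of that pre-generation approach in Python. The table schema, slugs, and product data are all hypothetical stand-ins for whatever your real store database holds -- the point is just that one offline pass over the database writes a plain `.html` file per product, with a file name you pick, so the web server never touches the database at request time:

```python
import sqlite3
import tempfile
from pathlib import Path

# Hypothetical product table standing in for a real e-commerce database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (slug TEXT, name TEXT, price REAL)")
conn.executemany(
    "INSERT INTO products VALUES (?, ?, ?)",
    [("blue-widget", "Blue Widget", 9.99), ("red-widget", "Red Widget", 12.50)],
)

out_dir = Path(tempfile.mkdtemp())  # in practice, your web server's docroot

# One pass over the table writes a truly static page per product,
# each with a crawler-friendly file name of our choosing.
for slug, name, price in conn.execute("SELECT slug, name, price FROM products"):
    page = (
        f"<html><head><title>{name}</title></head>"
        f"<body><h1>{name}</h1><p>${price:.2f}</p></body></html>"
    )
    (out_dir / f"{slug}.html").write_text(page)

print(sorted(p.name for p in out_dir.glob("*.html")))
# → ['blue-widget.html', 'red-widget.html']
```

Re-run the script whenever the catalog changes; in between, googlebot sees nothing but ordinary static URLs like /blue-widget.html.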
But if you have the patience to let googlebot eventually crawl everything, and you don't need to make frequent content updates to existing pages, then you will probably do fine -- eventually -- with regular dynamic URLs.