First -- do you pass parameters in the URL? If so, drop them and move to mod_rewrite.
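A minimal sketch of swapping query-string URLs for clean ones via mod_rewrite (assumes Apache with mod_rewrite enabled; the script name and URL pattern here are hypothetical -- adapt them to your own site):

```apache
# Serve /widgets/42 from the real script article.php?id=42,
# so crawlers and links only ever see the parameter-free URL.
RewriteEngine On
RewriteRule ^widgets/([0-9]+)$ /article.php?id=$1 [L]
```

You'd then link internally only to the clean /widgets/... form and let the rewrite happen server-side.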
Second -- Do you have a high enough visible toolbar PageRank? 5-7?
Third -- Do you have any pages in the daily crawl? If yes, start rotating deep links to uncrawled pages from them. If you don't have pages in the daily crawl, try to keep your front page as fresh as possible and aim for a PR4-5.
Fourth -- Do you have a good on-site nav structure? Good breadcrumbs? A good site map?
Have you memorised the crawling pattern? There are a few things to be learnt from how the bot hits your site. The frequency of homepage visits and the pages collected later...
You should do a "Random Page" thing, where each page links to 2-5 of your other pages. It's not spammy at all, and everything is guaranteed to eventually be seen by Google.
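The "Random Page" idea above can be sketched in a few lines (a hypothetical helper, not any particular CMS's API; `all_urls` is assumed to be your full list of internal URLs):

```python
import random

def random_internal_links(all_urls, current_url, k=3):
    """Pick k random internal links to render on a page.

    Since every page links to a few random others, a crawler
    following links can eventually reach any page on the site.
    """
    # never link a page to itself
    candidates = [u for u in all_urls if u != current_url]
    k = min(k, len(candidates))
    return random.sample(candidates, k)
```

You'd call this at render time for each page, so the link set changes between crawls.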
Adding deep(er) links to daily crawl pages will definitely help. Rotating deep links in our sitemap/directory has been a surefire way of helping G find our deep pages. Naturally, having a site structure that is Googlebot-friendly is also important.
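One way to rotate deep links like this, sketched as a deterministic daily batch (a hypothetical scheme -- the batch size and day counter are assumptions, not anything G-specific):

```python
def rotated_deep_links(deep_urls, day, batch_size=50):
    """Return the batch of deep pages to surface in the
    sitemap/directory on a given day.

    The window slides by batch_size each day and wraps around,
    so over len(deep_urls) / batch_size days every deep page
    gets a turn on a frequently crawled page.
    """
    if not deep_urls:
        return []
    start = (day * batch_size) % len(deep_urls)
    # concatenate so a batch near the end wraps to the front
    doubled = deep_urls + deep_urls
    return doubled[start:start + batch_size]
```

Regenerating the sitemap page daily with that batch keeps fresh deep links in front of the bot.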
We recently streamlined the site structure of one of our portals and noticed significantly higher daily crawls. As a result, site indexing almost quadrupled from 120k -> 450k pages.
Also remember that being crawled doesn't guarantee inclusion in the index.