Are you 100% sure? Does the server handle these URL issues without ever serving a second 200 response for the same content?
1. incorrect query strings
2. 404 pages (at both the server level and the platform level)
3. HTTPS protocol requests
4. a different order of parameters or virtual folders
5. index.html vs. the directory root
6. mixed-up letter cases
7. double slashes in the file path
If so, you have a rare creation. I have had many sites come to me for help, and almost all of them have at least one of these issues.
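One way to audit a site for several of the issues above is to write down the canonicalization rules you expect the server to enforce and check whether different URL variants all reduce to the same canonical form. Here is a minimal sketch in Python; it assumes (as an illustration, not a standard) that the canonical form is lowercase, without index.html, with query parameters sorted, and the rule set is mine, not an exhaustive one:

```python
import re
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonicalize(url: str) -> str:
    """Reduce common duplicate-URL variants to one assumed canonical form."""
    scheme, netloc, path, query, _frag = urlsplit(url)
    netloc = netloc.lower()                      # hostnames are case-insensitive
    path = re.sub(r"/{2,}", "/", path)           # collapse double slashes
    if path.endswith("/index.html"):             # index.html vs. directory root
        path = path[: -len("index.html")]
    path = path.lower()                          # mixed-up letter cases
    query = urlencode(sorted(parse_qsl(query)))  # normalize parameter order
    return urlunsplit((scheme, netloc, path, query, ""))

# Two variants that should map to the same page:
print(canonicalize("http://Example.com//Blog/index.html?b=2&a=1"))
print(canonicalize("http://example.com/blog/?a=1&b=2"))
# → both print http://example.com/blog/?a=1&b=2
```

If two variants canonicalize to the same URL but your server returns 200 for both without a redirect to the canonical one, you have found a duplicate-content leak.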
The best way to increase crawl rate, besides requesting it in your Webmaster Tools account and in your sitemap, is to run a very prominent site that is well respected in your field and widely linked to. The more important Google considers your site, the more thoroughly it will be crawled.
Also, make sure your server handles If-Modified-Since requests properly, has no DNS problems, no major server delays, no load-balancing issues, makes proper use of ETags, and applies compression appropriately, especially to text files.
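To make "handling If-Modified-Since properly" concrete from the crawler's side: on a revisit, the crawler echoes back the validators it saved from the previous response (ETag and Last-Modified), and a well-behaved server answers 304 Not Modified with an empty body when nothing has changed, saving bandwidth on both ends. A minimal sketch; the helper name and the lowercase-header-keys assumption are mine:

```python
def conditional_headers(prev: dict) -> dict:
    """Build the conditional request headers a crawler would send on a revisit.

    `prev` holds headers saved from the previous response,
    assumed here to use lowercase keys.
    """
    cond = {}
    if "etag" in prev:
        # Server compares this against the resource's current ETag
        cond["If-None-Match"] = prev["etag"]
    if "last-modified" in prev:
        # Server compares this against the resource's modification time
        cond["If-Modified-Since"] = prev["last-modified"]
    return cond

saved = {"etag": '"abc123"', "last-modified": "Tue, 15 Nov 1994 12:45:26 GMT"}
print(conditional_headers(saved))
# A server that ignores these headers re-sends the full body with 200
# every time, which wastes crawl budget on unchanged pages.
```

A quick manual check: request a stable page with curl, copy its ETag into an If-None-Match header on a second request, and confirm you get a 304 back rather than another 200.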
If you've got all of that handled on your side, then the ball is in Google's court. You cannot force more frequent spidering; you can only invite it by building a technically solid site with truly world-class content.