The two are unrelated. A sitemap is ostensibly a way to tell Google where all your pages are, in case it can't find them by itself.
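For reference, a sitemap is just an XML file following the sitemaps.org protocol. Here's a minimal sketch in Python that generates one; the URLs and dates are placeholders:

```python
# A minimal sketch: generate a sitemap.xml listing the pages you want
# Google to know about. The URLs and lastmod dates are placeholders.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"  # sitemap protocol namespace

pages = [
    ("https://www.example.com/", "2024-01-01"),
    ("https://www.example.com/deep/hard-to-find-page.html", "2024-01-01"),
]

urlset = ET.Element("urlset", xmlns=NS)
for loc, lastmod in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```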
The frequency of spidering is independent of this, and has more to do with backlinks and your own site structure. Another factor is how often your site is updated: if you're a big news agency adding articles all the time, they'll spider a lot. Crudely put, they spider 'important' material more frequently than 'less important' material.
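If you want to see how often you're actually being spidered, your server logs will tell you. A rough sketch, assuming a common/combined log format with the date in [dd/Mon/yyyy:...] brackets; the log path is a placeholder:

```python
# Count Googlebot hits per day in an access log to see how often
# your site is actually being spidered.
import re
from collections import Counter

hits = Counter()
date_re = re.compile(r"\[(\d{2}/\w{3}/\d{4})")  # e.g. [10/Oct/2024:13:55:36

with open("access.log") as log:
    for line in log:
        if "Googlebot" in line:
            m = date_re.search(line)
            if m:
                hits[m.group(1)] += 1

for day, count in sorted(hits.items()):
    print(day, count)
```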
A sitemap shouldn't be seen as a guarantee of crawl speed or inclusion, as Google's own guidelines indicate. It's merely there to let them know the pages exist; whether they get included is determined by other factors.
It is claimed that, for the moment, Googlebot shares results with the Mediabot (the AdSense crawler, Mediapartners-Google). If that's true, then putting AdSense on those uncrawled pages and viewing them yourself (to trigger a Mediabot visit) might get them indexed.
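If you try that, you can check whether the Mediabot actually visited by scanning your access log for its user agent. A quick sketch; Mediapartners-Google is the AdSense crawler's UA token, and the log path is a placeholder:

```python
# List which paths the AdSense crawler ("Mediabot") has requested,
# by scanning an access log for its user agent token. Assumes a
# common/combined log format where the request line is the first
# double-quoted field, e.g. "GET /some/page HTTP/1.1".
from collections import Counter

visited = Counter()
with open("access.log") as log:
    for line in log:
        if "Mediapartners-Google" in line:
            parts = line.split('"')
            if len(parts) > 1:
                request = parts[1].split()  # ["GET", "/some/page", "HTTP/1.1"]
                if len(request) >= 2:
                    visited[request[1]] += 1

for path, count in visited.most_common():
    print(count, path)
```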