We have multiple sitemaps and follow the Sitemap protocol. Our sitemap-index.xml was submitted within 48 hours of the tool becoming available in Webmaster Center, quite a while ago. I've updated the maps since then, and on every update I've re-submitted the index.xml file by pinging the correct URL.
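For anyone unfamiliar with the setup: per the sitemaps.org protocol, the index file is nothing more than a list of `<loc>` links pointing at the child sitemaps, so there's nothing exotic for a bot to parse. A minimal example (example.com and the filenames are placeholders, not our actual URLs):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- each <sitemap> entry points at one child sitemap file -->
  <sitemap>
    <loc>http://www.example.com/sitemap1.xml</loc>
    <lastmod>2008-01-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap2.xml</loc>
    <lastmod>2008-01-15</lastmod>
  </sitemap>
</sitemapindex>
```

A crawler that supports the protocol is supposed to fetch the index, then request each `<loc>` URL in turn.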
I've been keeping an eye on this for ages now, and MSNBot will only ever request the index file. It doesn't follow the links in that index; it has never once requested the sitemaps listed below it.
To me this shows the inadequacy of MSN's crawler technology. If they can't follow a list of links on a page that's specifically designed for their robot, how can they crawl the web? Their indexing skills are about as useful as a chocolate fireguard.
My indexed-page totals have been falling over the last few weeks. I'm still ranking, and what little traffic MSN and Live send me hasn't changed, but I'm resisting the urge to submit the individual sitemaps below the index purely as a test. I see it as their job to fix, and I'm not dependent on them for revenue.
Anyone else seen this happening?