I don't use XML sitemaps, ever. However, I have noticed something odd in WMT reports over the last month or so.
At the beginning of the year, on a dynamic site, I used robots.txt to block one type of page (i.e. one parameter name with any value) with a rule something like
Disallow: /*something=
or similar.
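For anyone unfamiliar with how Google interprets that rule: Googlebot treats * in a Disallow path as a wildcard matching any run of characters, so /*something= blocks any URL whose path-plus-query contains "something=". A rough sketch of that matching logic (my own illustration, not Google's actual code; the function name is hypothetical):

```python
import re

def robots_pattern_matches(pattern: str, url_path: str) -> bool:
    """Sketch of wildcard matching as used by Googlebot for robots.txt:
    '*' matches any sequence of characters, a trailing '$' anchors the
    pattern to the end of the URL. Matching is against path + query."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape everything except '*', which becomes '.*'
    regex = ".*".join(re.escape(part) for part in pattern.split("*"))
    regex = "^" + regex + ("$" if anchored else "")
    return re.match(regex, url_path) is not None

# The rule above blocks any URL carrying the parameter:
print(robots_pattern_matches("/*something=", "/page.php?something=value"))
print(robots_pattern_matches("/*something=", "/page.php?other=value"))
```

Note that the standard-library urllib.robotparser does not understand these wildcards (it does plain prefix matching), so it is not a reliable way to test rules like this against Googlebot's behaviour.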
Within days, WMT reports listed many thousands of these URLs as blocked by robots.txt. Over the last month or so, however, that number has declined and is now down to only a few hundred. The URLs are still disallowed in robots.txt, and no internal linking on the site has changed in recent months.