dmorison - 8:22 am on May 23, 2013 (gmt 0)
Can you tell for sure that the search engine is using the information from your sitemaps and not from its own observation? You said "gained trust in my sitemaps". I've wondered about that occasionally. Is some part of Google's computer set aside for comparing sitemap statements with observed fact, so that after a time it can "decide" whether to trust the sitemap when a new page is listed?
I totally accept your point that an intelligent search engine, in conjunction with a well-structured website, should be able to work it all out for itself. Case in point: if somebody replies to a two-year-old thread it is, just like here on WebmasterWorld, "bumped" to the top of the forum index page, which is crawled regularly; so even if a search engine has known about that URL for two years, the bump alone should be enough of a hint that something might have changed.
Rather, I see LastMod as being more useful to a search engine in terms of saving it from going through the motions of determining a change frequency for itself (a straightforward algo, but one that takes time and resources). But no, I haven't yet studied crawling patterns before and after using a dynamic LastMod sitemap, so I could never say for sure that it makes any difference at all. That's why I would only do it if it can be done accurately - something along the lines of the sketch below - and if that's the case it certainly can't do any harm.
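Just to illustrate what I mean by "dynamic and accurate": a minimal sketch where <lastmod> is pulled straight from real data (here, a forum thread's latest reply time) rather than being hand-maintained. The get_threads-style data and its field names are purely hypothetical placeholders, not anyone's actual code.

```python
# Minimal sketch: build a sitemap whose <lastmod> comes from the data itself
# (e.g. a thread's latest reply time), so it stays accurate with no manual edits.
# The "threads" structure and its field names are invented for illustration.
from datetime import datetime, timezone
from xml.sax.saxutils import escape

def build_sitemap(threads):
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for t in threads:
        # lastmod reflects the most recent observable change (latest reply).
        lastmod = t["last_reply_at"].astimezone(timezone.utc).strftime("%Y-%m-%d")
        lines.append("  <url>")
        lines.append(f"    <loc>{escape(t['url'])}</loc>")
        lines.append(f"    <lastmod>{lastmod}</lastmod>")
        lines.append("  </url>")
    lines.append("</urlset>")
    return "\n".join(lines)

# Example usage with made-up data:
threads = [{"url": "https://example.com/forum/thread-123",
            "last_reply_at": datetime(2013, 5, 22, 14, 30, tzinfo=timezone.utc)}]
print(build_sitemap(threads))
```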
Regarding the "trust" issue, yes - that's basically the assumption - I wasn't thinking so much in terms of any kind of intentional mis-information from a webmaster - telling porkies in your sitemap isn't going to get anyone anywhere; it's more for the search engine's own benefit to cover situations where perhaps a sitemap has been neglected by a webmaster - not a good idea but i'm sure it happens - they put one up when XML sitemaps were the latest hot topic in SEO and then forgotten about it - leaving it online with quite possibly historic manually crafted LastMod dates that don't even remotely correlate with observed fact!
If a search engine did assess sitemap quality before relying on it too much, then it's of course possible that a sitemap that sucks could be seen as a negative quality indicator for the domain overall, but I wouldn't have thought it would be weighted very highly as one. Googlebot is quite capable of saying to itself, "Hey, great content here. Sitemap is as much use as a chocolate teapot, but we'll let 'em off!"
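Purely speculative, of course - nobody outside Google knows whether anything like this exists - but the kind of trust check we're talking about could be as simple as comparing each claimed LastMod against the change the crawler actually observed on recrawl and keeping a running agreement ratio. All the field names and the tolerance here are made up for illustration.

```python
# Hypothetical sketch of a "sitemap trust" score: compare the LastMod a sitemap
# claims for each URL with the change the crawler actually observed (e.g. a
# content-hash change between fetches). Nothing here is documented behaviour of
# any search engine; field names and the tolerance are invented.
from datetime import timedelta

def sitemap_trust(records, tolerance=timedelta(days=2)):
    """records: iterable of dicts with 'claimed_lastmod' (datetime or None)
    and 'observed_change' (datetime or None) per URL."""
    checked = agreed = 0
    for r in records:
        claimed, observed = r.get("claimed_lastmod"), r.get("observed_change")
        if claimed is None or observed is None:
            continue  # nothing to compare for this URL
        checked += 1
        if abs(claimed - observed) <= tolerance:
            agreed += 1
    # Fraction of URLs where the sitemap's claim matched observation; a low
    # value would suggest the LastMod data isn't worth trusting.
    return agreed / checked if checked else None
```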