if the last-modified value is set, then search engine spiders and browsers know whether they need to fetch the page again or whether they can display the cached version - this is very useful for regular visitors to the site and for search engine spiders. e.g.
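to illustrate the mechanism: the server sends a `Last-Modified` header with the page; on the next visit the client sends that date back in an `If-Modified-Since` header, and the server can answer `304 Not Modified` with no body instead of resending the whole page. here's a minimal sketch of that decision (the function name and the way you'd get the page's modification time are my own, not from any particular framework):

```python
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime

def conditional_get(page_mtime, if_modified_since=None):
    """Decide between a 200 (send full page) and a 304 (cached copy is fine).

    page_mtime: aware UTC datetime of when the page content last changed.
    if_modified_since: the If-Modified-Since request header value, or None.
    Returns (status_code, response_headers).
    """
    headers = {"Last-Modified": format_datetime(page_mtime, usegmt=True)}
    if if_modified_since:
        cached = parsedate_to_datetime(if_modified_since)
        # HTTP dates have one-second resolution, so compare at that granularity
        if page_mtime.replace(microsecond=0) <= cached:
            return 304, headers  # client/spider can reuse its cached copy
    return 200, headers

# first visit: no conditional header, full page goes out
mtime = datetime(2024, 1, 1, 12, 0, 0, tzinfo=timezone.utc)
status, hdrs = conditional_get(mtime)

# repeat visit: browser echoes the Last-Modified date back, gets a 304
status_again, _ = conditional_get(mtime, hdrs["Last-Modified"])
```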
you have a 10,000 page site, googlebot comes in, and without the last-modified date it doesn't know which pages have new content and which do not - so it spiders higgledy-piggledy. (i think with google's new sitemaps feature you could probably get round this by having a separate script that updates the sitemap with all newly changed pages.)
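that separate-script idea could look something like this - a sketch that takes a list of (url, last-changed) pairs and emits a sitemap with `<lastmod>` entries, which is the bit the spider uses to pick out the freshly changed pages (the function and its inputs are illustrative assumptions, not a real tool):

```python
from datetime import datetime, timezone
from xml.sax.saxutils import escape

def sitemap_xml(pages):
    """Build a sitemap from (url, last_modified_datetime) pairs.

    The <lastmod> element tells the spider when each page last changed,
    so it can skip pages it has already seen in their current state.
    """
    entries = []
    for url, mtime in pages:
        entries.append(
            "  <url><loc>%s</loc><lastmod>%s</lastmod></url>"
            % (escape(url), mtime.strftime("%Y-%m-%d"))
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>"
    )

xml = sitemap_xml([
    ("http://example.com/", datetime(2024, 1, 1, tzinfo=timezone.utc)),
    ("http://example.com/news", datetime(2024, 1, 5, tzinfo=timezone.utc)),
])
```

in practice the script would query the database for pages changed since the last run and rewrite (or ping) the sitemap, rather than rebuilding it from a hard-coded list.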
however, the more complicated the page gets - i.e. the more database calls and includes you use on each page - the harder it gets to work out the true last modification time. this is why most big dynamic sites don't bother sending it.
we have a relatively big site and we refresh the last-modified time every 6-12 hours, depending on the page. that seems to me to be a good enough compromise between cacheability and ease of code.
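one cheap way to get that compromise is to not compute the real modification time at all, but to round the current time down to the start of an n-hour bucket, so the header only changes a few times a day. a sketch of that idea (my own helper, assuming UTC timestamps):

```python
from datetime import datetime, timezone

def bucketed_last_modified(now, hours=6):
    """Round 'now' down to the start of the current N-hour bucket.

    All requests inside the same bucket see the same Last-Modified value,
    so caches and spiders refetch the page at most once per bucket,
    without the server having to track every underlying database change.
    """
    bucket_start = (now.hour // hours) * hours
    return now.replace(hour=bucket_start, minute=0, second=0, microsecond=0)

# 17:45 falls in the 12:00-18:00 bucket, so the header says 12:00
stamp = bucketed_last_modified(
    datetime(2024, 1, 1, 17, 45, tzinfo=timezone.utc), hours=6
)
```

the trade-off is exactly the one described above: a page that changed at 12:01 looks unchanged until 18:00, but in exchange the code never has to reason about which database calls and includes fed into the page.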