What do you do when you have a website that updates very often, perhaps several times a day - do you generate a new xml sitemap each time something is posted and upload it to Google (so you upload several new versions of xml sitemaps a day)?
And what about large sites (say 30K pages) that add content regularly? Should the whole xml sitemap be re-uploaded each time a page is added?
Your sitemap should be regenerated whenever a page is added or updated; blog sitemaps are typically regenerated on every change as well. The xml sitemaps don't need to be uploaded to Google. They should be generated and hosted on your own site, with their URLs listed in the robots.txt file so bots know where to find them.
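For example, a robots.txt file at the site root might point crawlers at the sitemaps like this (the filenames here are just placeholders):

```
User-agent: *
Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/blog-sitemap.xml
```

The `Sitemap:` directive takes a full absolute URL, and you can list as many sitemaps as you have.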
If you are generating your sitemaps manually, look around for one of the many automated solutions instead. A site can have several sitemaps for different kinds of content, since many sitemap generators skip dynamic pages or static pages depending on what they are designed to do. Just list each sitemap in the robots.txt file.
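For the 30K-page case: a single sitemap file is well within the protocol's limits (50,000 URLs and 50 MB uncompressed per file), so regenerating one file is fine. If you do split content across several sitemaps, you can reference them all from a single sitemap index and list only the index in robots.txt. A minimal index, with placeholder URLs, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-pages.xml</loc>
    <lastmod>2024-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-blog.xml</loc>
  </sitemap>
</sitemapindex>
```

The optional `<lastmod>` element tells crawlers which child sitemaps have changed, so they can skip re-fetching the unchanged ones.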