I have a database-driven website with about 170K pages. I have all the URLs listed and ready to include in sitemaps. I know that you cannot have more than 50K URLs per sitemap and that a sitemap file cannot be bigger than 10 megabytes. I also have to watch out for duplicate content.
Are there any other issues I need to know about? Are there any strategies that can be helpful? Any mistakes to avoid?
It depends on how you can break the pages down (by topic, alphabetically, etc.). The best approach might be to maintain a database table of pages with their lastmod date and priority, and generate the sitemaps from that table.
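Something along these lines is what I mean. This is just a minimal sketch in Python, assuming a simple SQLite table `pages(url, lastmod, priority)` that you have already populated; the table and file names are hypothetical:

```python
import sqlite3
from xml.sax.saxutils import escape

MAX_URLS = 50_000  # protocol limit per sitemap file


def write_sitemaps(db_path="site.db", prefix="sitemap"):
    """Split the pages table into 50K-URL chunks and write one sitemap per chunk."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT url, lastmod, priority FROM pages ORDER BY url"
    ).fetchall()
    files = []
    for i in range(0, len(rows), MAX_URLS):
        chunk = rows[i:i + MAX_URLS]
        name = f"{prefix}-{i // MAX_URLS + 1}.xml"
        with open(name, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url, lastmod, priority in chunk:
                f.write(
                    f"  <url><loc>{escape(url)}</loc>"
                    f"<lastmod>{lastmod}</lastmod>"
                    f"<priority>{priority}</priority></url>\n"
                )
            f.write("</urlset>\n")
        files.append(name)
    return files
```

Ordering by a stable key (here the URL) keeps each URL in the same sitemap file between runs, which makes it easier to regenerate only the files that changed.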
Also pay close attention to the highest lastmod date in each sitemap, since you can use a sitemap index file to point to multiple sitemaps. (170K might sound large, but once you get to 1M or a few hundred million pages, things take time to generate, and anything that saves traffic and unnecessary spidering is good.) With the database table approach, only sitemaps whose contents have a newer lastmod date need to be regenerated.
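The index file itself could look something like this. Again a rough sketch under the same hypothetical schema, reusing `MAX_URLS` from above; the base URL and file names are placeholders:

```python
def write_index(db_path="site.db", prefix="sitemap",
                base="https://example.com/", index_file="sitemap-index.xml"):
    """Write a sitemap index; each entry's <lastmod> is the highest lastmod
    of the URLs in that chunk."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute("SELECT lastmod FROM pages ORDER BY url").fetchall()
    with open(index_file, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for i in range(0, len(rows), MAX_URLS):
            chunk = rows[i:i + MAX_URLS]
            newest = max(r[0] for r in chunk)  # highest lastmod in this sitemap
            name = f"{prefix}-{i // MAX_URLS + 1}.xml"
            f.write(
                f"  <sitemap><loc>{base}{name}</loc>"
                f"<lastmod>{newest}</lastmod></sitemap>\n"
            )
        f.write("</sitemapindex>\n")
```

To skip unnecessary work, you could compare each chunk's highest lastmod against the timestamp of the sitemap file already on disk and only rewrite the files whose chunk actually changed.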