Forum Moderators: goodroi

Sitemaps for blogs and frequently updated sites?

7:01 pm on Apr 14, 2010 (gmt 0)

Full Member

5+ Year Member

joined:Aug 28, 2007
votes: 0

What do you do when you have a website that updates very often, perhaps several times a day? Do you generate a new XML sitemap each time something is posted and upload it to Google (so you upload several new versions of the XML sitemap a day)?

And what about large sites (say 30K pages) that add content regularly? Should the whole XML sitemap be re-uploaded each time a page is added?
2:10 pm on Apr 16, 2010 (gmt 0)

Moderator from US 

WebmasterWorld Administrator 5+ Year Member Top Contributors Of The Month

joined:Dec 27, 2006
votes: 48

Your sitemap should be regenerated whenever a page is updated; blog sitemaps regenerate for every change, too. XML sitemaps don't need to be uploaded to Google: they should be generated and hosted on your own site, with their URIs listed in the robots.txt file so bots know where to find them.
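
As a quick illustration of listing sitemap URIs in robots.txt (the domain and filenames here are hypothetical placeholders):

```text
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/blog-sitemap.xml
```

Each Sitemap line must give the full absolute URL, and you can list more than one.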

If you are manually generating your sitemaps, look around for one of the many automated solutions. A site can have several sitemaps for different kinds of content, since many sitemap generators skip dynamic pages or static pages, depending on what they are designed to do. Just list each sitemap in the robots.txt file.
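
If you'd rather script it yourself, regenerating a sitemap on each post is straightforward. A minimal sketch, assuming a list of page URLs with last-modified dates (the URLs and dates below are made up for illustration):

```python
from datetime import date
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """Build sitemap XML from a list of (url, lastmod_date) tuples."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url, lastmod in pages:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = lastmod.isoformat()
    # Return the serialized document; in practice you would write this
    # to sitemap.xml in your web root every time content changes.
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical page list; a real script would pull this from your CMS or database.
pages = [
    ("https://example.com/post-1", date(2010, 4, 14)),
    ("https://example.com/post-2", date(2010, 4, 16)),
]
xml = build_sitemap(pages)
```

Running this after every post (e.g. from a publish hook or a cron job) keeps the hosted sitemap current, with no manual upload step.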