
Sitemaps, Meta Data, and robots.txt Forum

    
Sitemaps for blogs and frequently updated sites?
DiscoStu
7:01 pm on Apr 14, 2010 (gmt 0)

What do you do when you have a website that updates very often, perhaps several times a day? Do you generate a new XML sitemap each time something is posted and upload it to Google (so you upload several new versions of the XML sitemap a day)?

And what about large sites (say 30K pages) that keep adding content? Should the whole XML sitemap be re-uploaded each time a page is added?

 

not2easy
2:10 pm on Apr 16, 2010 (gmt 0)

Your sitemap should be regenerated whenever a page is updated; blog sitemaps regenerate on every change too. XML sitemaps don't need to be uploaded to Google: they should be generated on your site and hosted on your site, with their URI(s) listed in the robots.txt file so bots know where to find them.
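For concreteness, a minimal Python sketch of the "regenerate it and host it yourself" approach; get_published_urls() is a hypothetical stand-in for whatever query returns your site's current page URLs:

import datetime
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def write_sitemap(urls, path="sitemap.xml"):
    # Register the default namespace so the output uses a plain xmlns attribute.
    ET.register_namespace("", SITEMAP_NS)
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    today = datetime.date.today().isoformat()
    for loc in urls:
        url = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        ET.SubElement(url, "{%s}loc" % SITEMAP_NS).text = loc
        ET.SubElement(url, "{%s}lastmod" % SITEMAP_NS).text = today
    # Overwrite the copy served from the site root; nothing is pushed to Google.
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

# Called from the publish hook, e.g.:
# write_sitemap(get_published_urls())  # get_published_urls() is hypothetical

Then a single line in robots.txt (example.com is a placeholder) tells crawlers where to find it:

Sitemap: http://www.example.com/sitemap.xml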

If you are generating your sitemaps by hand, look around for one of the many automated solutions. A site can have several sitemaps for different kinds of content, since many sitemap generators skip dynamic pages or static pages depending on what they are designed to do. Just list each sitemap in the robots.txt file.
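For a large site like the 30K-page example above, the usual sitemaps.org pattern is a sitemap index: split the URLs across several sitemaps (each limited to 50,000 URLs), list them in one index file, and regenerate only the sitemap whose section actually changed rather than re-uploading everything. The file names below are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-posts.xml</loc>
    <lastmod>2010-04-16</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-static.xml</loc>
  </sitemap>
</sitemapindex>

In robots.txt you can either list each sitemap on its own Sitemap: line or point to just the index file.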
