
Forum Moderators: phranque


Issues with large site maps



7:13 pm on Apr 29, 2014 (gmt 0)

10+ Year Member

I have a database-driven website with about 170K pages. I have all the URLs listed and ready to include in sitemaps. I know that a sitemap cannot contain more than 50K URLs and cannot be larger than 10 megabytes. I also have to watch out for duplicate content.

Are there any other issues I should know about? Are there any helpful strategies? Any mistakes to avoid?


9:10 pm on May 11, 2014 (gmt 0)

5+ Year Member

Create sitemaps based on your website's sections or article categories, not just four big sitemaps. With more granular sitemaps you can look in GWT and see which sections of your website got indexed or dropped.
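Not from the thread itself, but here's a minimal Python sketch of that idea: group URLs per category and still split any category that blows past the 50K-URL protocol limit. The function name, the category names, and the example URLs are my own illustration, not anything the poster specified.

```python
from xml.sax.saxutils import escape

MAX_URLS = 50_000  # per-file limit from the sitemaps.org protocol


def build_sitemaps(urls_by_category):
    """Build one (or more) sitemap XML documents per category.

    urls_by_category: {category_name: [url, ...]}
    Returns {filename: xml_string}; categories larger than MAX_URLS
    are split into numbered files.
    """
    sitemaps = {}
    for category, urls in urls_by_category.items():
        for i in range(0, len(urls), MAX_URLS):
            chunk = urls[i:i + MAX_URLS]
            name = f"sitemap-{category}-{i // MAX_URLS + 1}.xml"
            entries = "\n".join(
                f"  <url><loc>{escape(u)}</loc></url>" for u in chunk
            )
            sitemaps[name] = (
                '<?xml version="1.0" encoding="UTF-8"?>\n'
                '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                f"{entries}\n</urlset>\n"
            )
    return sitemaps
```

With per-category filenames like `sitemap-articles-1.xml`, GWT's sitemap report shows indexed counts per file, which is exactly the per-section visibility described above.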


11:01 pm on May 11, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

It depends on how you can break the pages down (by topic, alphanumerically, etc.). The best approach might be to maintain a database table of pages with their lastmod date and priority, and generate the sitemaps from that table.
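A rough sketch of that table-driven approach, assuming a `pages` table with `url`, `lastmod`, `priority`, and `category` columns (the schema and function name are my assumptions, not something from the thread):

```python
import sqlite3
from xml.sax.saxutils import escape


def sitemap_entries(con):
    """Emit sitemap <url> entries straight from a pages table.

    con: an open sqlite3 connection whose `pages` table holds
    url, lastmod, priority and category columns. Ordering by
    category keeps each category's URLs contiguous, so the rows
    can be streamed into per-category sitemap files.
    """
    rows = con.execute(
        "SELECT url, lastmod, priority FROM pages ORDER BY category, url"
    )
    entries = []
    for url, lastmod, priority in rows:
        entries.append(
            f"  <url><loc>{escape(url)}</loc>"
            f"<lastmod>{lastmod}</lastmod>"
            f"<priority>{priority}</priority></url>"
        )
    return entries
```

Because the table already stores lastmod per page, the same query (with a `MAX(lastmod) GROUP BY category`) can tell you which sitemap files actually changed.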

Also pay close attention to the highest lastmod date in each sitemap, since you can use a sitemap index file to reference multiple sitemaps. (170K may sound large, but once you get to 1M or a few hundred million pages, generation takes real time, and anything that saves traffic and unnecessary spidering is good. :) ) With the database-table approach, only sitemaps whose lastmod dates have changed need to be regenerated.
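To illustrate the index-file part: a sitemap index lists each child sitemap with its highest lastmod, so crawlers can skip files that haven't changed. A minimal sketch (the function name and example URLs are mine):

```python
def build_sitemap_index(sitemap_lastmods):
    """Build a sitemap index document.

    sitemap_lastmods: {sitemap_url: highest lastmod date found in
    that sitemap}. Advertising an accurate per-file lastmod is what
    lets crawlers avoid re-fetching unchanged sitemaps.
    """
    items = "\n".join(
        f"  <sitemap><loc>{loc}</loc><lastmod>{lm}</lastmod></sitemap>"
        for loc, lm in sorted(sitemap_lastmods.items())
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{items}\n</sitemapindex>\n"
    )
```

You then submit only the index URL to GWT; the 50K-entry and size limits apply to the index file itself as well.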

