


Issues with large site maps

     
7:13 pm on Apr 29, 2014 (gmt 0)

Junior Member

10+ Year Member

joined:Nov 12, 2003
posts: 198
votes: 0


I have a database-driven website with about 170K pages. I have all the URLs listed and ready to include in sitemaps. I know that you cannot have more than 50K URLs per sitemap and that a sitemap cannot be bigger than 10 megabytes. I also have to watch out for duplicate content.
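
[For illustration, a minimal Python sketch of chunking a URL list so that every file respects both limits mentioned above. The file naming and the urls argument are assumptions, not anything specified in the thread.]

# Minimal sketch: split a flat URL list into sitemap files of at most
# 50,000 URLs and roughly 10 MB each (names are illustrative only).
from xml.sax.saxutils import escape

MAX_URLS = 50000                 # protocol limit: URLs per sitemap file
MAX_BYTES = 10 * 1024 * 1024     # size limit mentioned above (uncompressed)
HEADER = ('<?xml version="1.0" encoding="UTF-8"?>\n'
          '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
FOOTER = '</urlset>\n'

def split_into_sitemaps(urls):
    files, entries, size = [], [], len(HEADER) + len(FOOTER)
    def flush():
        name = "sitemap-%d.xml" % (len(files) + 1)
        with open(name, "w") as f:
            f.write(HEADER + "".join(entries) + FOOTER)
        files.append(name)
    for url in urls:
        entry = "  <url><loc>%s</loc></url>\n" % escape(url)
        # start a new file before either limit would be exceeded
        if len(entries) == MAX_URLS or size + len(entry) > MAX_BYTES:
            flush()
            entries, size = [], len(HEADER) + len(FOOTER)
        entries.append(entry)
        size += len(entry)       # char count; close enough for ASCII URLs
    if entries:
        flush()
    return files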

Are there any other issues I need to know about? Are there any strategies that can be helpful? Any mistakes to avoid?
9:10 pm on May 11, 2014 (gmt 0)

Full Member

5+ Year Member

joined:Aug 16, 2010
posts: 214
votes: 11


Create sitemaps based on your website's sections or article categories, not just four big sitemaps. With more sitemaps you can look in GWT and see which sections of your website got indexed or dropped.
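
[A minimal sketch of that per-section approach: one sitemap per section plus a sitemap index referencing them all. The sections dict and the example.com host are assumptions for illustration.]

# Minimal sketch: write one sitemap per site section, then an index
# file listing them, so GWT can report indexation section by section.
from xml.sax.saxutils import escape

def write_section_sitemaps(sections, base="http://www.example.com/"):
    # sections maps a section name to its URL list,
    # e.g. {"articles": [...], "products": [...]}  (hypothetical)
    names = []
    for section, urls in sections.items():
        name = "sitemap-%s.xml" % section
        with open(name, "w") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in urls:
                f.write('  <url><loc>%s</loc></url>\n' % escape(url))
            f.write('</urlset>\n')
        names.append(name)
    # one index referencing all section sitemaps
    with open("sitemap-index.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for name in names:
            f.write('  <sitemap><loc>%s%s</loc></sitemap>\n'
                    % (escape(base), name))
        f.write('</sitemapindex>\n')

[Submitting the single index file is enough; GWT then breaks the indexed counts out per sitemap, i.e. per section.]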
11:01 pm on May 11, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Aug 30, 2002
posts: 2415
votes: 24


It depends on how you can break down the webpages (by topic, alphanumerically, etc.). The best way might be to maintain a database table of pages, their lastmod dates, and their priorities, and generate the sitemaps from that table.

Also pay close attention to the highest lastmod date in each sitemap, since you can use a sitemap index file to manage multiple sitemaps. (170K might sound large, but when you get to 1M pages, or a few hundred million, things take some time to generate, and anything that saves traffic and unnecessary spidering is good. :) ) With the database table approach, only sitemaps with new lastmod dates need to be regenerated.
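
[A minimal sketch of that table-driven regeneration, assuming a SQLite table pages(url, section, lastmod, priority) with ISO dates; the table, column, and file names are all assumptions.]

# Minimal sketch: regenerate only the section sitemaps whose newest
# lastmod is newer than the file already on disk.
import os
import sqlite3
import time
from xml.sax.saxutils import escape

def regenerate_stale(conn):
    rows = conn.execute(
        "SELECT section, MAX(lastmod) FROM pages GROUP BY section").fetchall()
    for section, newest in rows:
        name = "sitemap-%s.xml" % section
        if os.path.exists(name):
            # ISO dates compare correctly as strings
            built = time.strftime("%Y-%m-%d",
                                  time.gmtime(os.path.getmtime(name)))
            if built >= newest:
                continue  # nothing changed in this section; skip it
        with open(name, "w") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url, lastmod, priority in conn.execute(
                    "SELECT url, lastmod, priority FROM pages"
                    " WHERE section = ?", (section,)):
                f.write('  <url><loc>%s</loc><lastmod>%s</lastmod>'
                        '<priority>%.1f</priority></url>\n'
                        % (escape(url), lastmod, priority))
            f.write('</urlset>\n')

# usage, assuming a hypothetical site.db with a populated pages table:
regenerate_stale(sqlite3.connect("site.db"))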

Regards...jmcc
 
