Forum Library, Charter, Moderators: goodroi

Sitemaps, Meta Data, and robots.txt Forum

Can I have multiple sitemap.xml's for Google?

Msg#: 4537852 posted 2:48 am on Jan 21, 2013 (gmt 0)

Hi. Like most of us, I have many different alias domains all directing to the same site, with keyword-rich aliases to help drive more traffic. So I have, and want, a different XML sitemap for each alias version that I want Google to index. I keep the main one in the root and the rest in a subfolder called sitemaps.

I'm getting an "Unsupported file format" error. It's a very nebulous error, with a link to information that also doesn't help. What's the usual reason/solution for this "Unsupported file format" error when the file is already (I think?) in the right .xml format, created properly with, in this case, GSiteCrawler's tool? My main sitemap is getting indexed, but all the rest in the subfolder are getting either this error or a message saying they are "html" (but they're .xml). I have to get this fixed. Thanks for taking the time to help me out.



Hoople, 10+ Year Member

Msg#: 4537852 posted 3:00 am on Jan 21, 2013 (gmt 0)

Welcome to WebmasterWorld 2kegx!

Try adding two separate sitemap files in Google Webmaster Tools. In my experience, Google gives a bit more error feedback there. There are a number of online tools to generate them.

The most common use of multiple sitemap files reported here is on huge sites that exceed the 50,000-entries-per-file limit. Others have used them in hopes of improving deep-page indexation, and some have used them to resubmit recently moved pages.
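For the large-site case, the sitemap protocol provides a sitemap index file that lists the individual sitemaps. A minimal sketch (the filenames and domain are hypothetical, not from this thread):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://www.example.com/sitemaps/sitemap1.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemaps/sitemap2.xml</loc></sitemap>
</sitemapindex>
```

You submit the index file once, and each listed sitemap stays under the 50,000-URL / size limit on its own.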


Msg#: 4537852 posted 3:55 am on Jan 21, 2013 (gmt 0)

Thanks Hoople. Yes, I have them all in Google Webmaster Tools; that's where I'm getting the error. Each sitemap has only a few hundred entries at most. I wonder what's going on.


WebmasterWorld Administrator phranque is a WebmasterWorld Top Contributor of All Time, 10+ Year Member, and Top Contributor of the Month

Msg#: 4537852 posted 9:26 am on Jan 21, 2013 (gmt 0)

welcome to WebmasterWorld, 2kegx!

have you validated your sitemap?
what Content-Type header is sent with the response when you GET your sitemaps?
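A quick way to test the first question locally is to check that the file parses as XML and has a sitemap root element. This is a sketch, not GSiteCrawler's own validator; the sample content is hypothetical. (For the second question, fetching the sitemap URL and inspecting the response's Content-Type header should show something like text/xml or application/xml rather than text/html.)

```python
# Minimal well-formedness check for a sitemap document (a sketch;
# the sample XML below is hypothetical, not from this thread).
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def looks_like_sitemap(xml_text: str) -> bool:
    """True if the text parses as XML with a <urlset> or
    <sitemapindex> root in the sitemap namespace."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError:
        return False
    return root.tag in (f"{{{SITEMAP_NS}}}urlset",
                        f"{{{SITEMAP_NS}}}sitemapindex")

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
</urlset>"""

print(looks_like_sitemap(sample))           # a proper sitemap -> True
print(looks_like_sitemap("<html></html>"))  # an HTML page served instead -> False
```

If a "sitemap" URL fails this check, the server is usually returning an error page or HTML instead of the XML file, which matches Google's "html" complaint.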


Msg#: 4537852 posted 10:32 am on Jan 21, 2013 (gmt 0)

Thanks phranque. I found the problem. Basically, I had a web.config file working in conjunction with a PHP file to privatize various extensions (e.g. any .txt, .xls, or .xml file would get a 404 error). So when I tried to submit those .xml sitemaps, they got redirected to the 404 "html" page, which is why Google kept calling them the wrong format. After deleting the ".xml" part of those codes/arrays, all the sitemaps submitted! :) The only bummer is that now everyone, competitors included, can see all my keyword-rich URLs and copy them, and I put a lot of work into them. Is there another solution to block competitors from seeing those sitemaps? I know the usual methods like "don't call 'em sitemap and hide 'em in various places", but I'm looking for some other way, I guess.

In the meantime, since I'm talking about my web.config file (indirectly) anyway, and since I'm on IIS7: I'm getting a lot of unwanted traffic from a particular blog, and I want to block that domain from my site altogether. How can I block an entire domain (preferably without using IPs) in web.config?
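One common approach for referral traffic on IIS7 is a URL Rewrite rule that aborts any request whose Referer header points at the unwanted domain. This is a sketch that assumes the IIS URL Rewrite module is installed; the domain name in the pattern is a placeholder:

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Abort any request whose Referer header matches the unwanted blog -->
        <rule name="BlockUnwantedReferrer" stopProcessing="true">
          <match url=".*" />
          <conditions>
            <add input="{HTTP_REFERER}" pattern="unwanted-blog\.example" />
          </conditions>
          <action type="AbortRequest" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

Note this only blocks visitors arriving via links from that site (Referer-based); it won't stop requests that omit or forge the header, which is where IP-based restrictions would come in.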



All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved