Forum Moderators: Robert Charlton & goodroi


Wildcard subdomain on sitemap


force123

2:53 am on Jun 24, 2009 (gmt 0)

10+ Year Member



Hi,

After 4-5 failed attempts at submitting my XML sitemap to Google, I found out that I can't put any subdomain URLs in www.example.com/sitemap.xml, and I read that each subdomain must have its own sitemap.

Now I'm using wildcard subdomains. There is no actual folder for my subdomains.

How can I make Google accept my URLs?

tedster

6:55 am on Jun 24, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You've got to create and submit a sitemap for one specific subdomain at a time. If you are changing your subdomain to an infinite number of possible strings, you will not be able to submit those infinite possibilities.
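For illustration, a sitemap for one specific subdomain lists only URLs on that host. A minimal sketch (the hostname and post IDs are invented, following the examples in this thread) might be served at city1.example.com/sitemap.xml:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://city1.example.com/post235/</loc>
  </url>
  <url>
    <loc>http://city1.example.com/post512/</loc>
  </url>
</urlset>
```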

Let me point out the obvious - your server is pulling the content for those URLs from somewhere when it responds to a request, so there should be some resource you can use to list the most important URLs for each subdomain.
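As a sketch of that idea, the per-subdomain sitemaps could be generated from the same data the server already uses to answer requests. The record layout and hostname pattern below are assumptions for illustration, not the actual schema:

```python
# Sketch: build one sitemap per city subdomain from the post records
# the server already queries. The (city, post_id) record format and the
# "cityN.example.com/postN/" URL pattern are assumptions.
from collections import defaultdict
from xml.sax.saxutils import escape

posts = [
    ("city1", 235),
    ("city1", 512),
    ("city2", 7),
]

def build_sitemaps(records):
    """Return {city: sitemap_xml} with one <urlset> per subdomain."""
    by_city = defaultdict(list)
    for city, post_id in records:
        by_city[city].append("http://%s.example.com/post%d/" % (city, post_id))
    sitemaps = {}
    for city, urls in by_city.items():
        entries = "\n".join(
            "  <url><loc>%s</loc></url>" % escape(u) for u in urls
        )
        sitemaps[city] = (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            "%s\n</urlset>" % entries
        )
    return sitemaps

sitemaps = build_sitemaps(posts)
```

Each generated document would then be saved (or served dynamically) as that subdomain's /sitemap.xml and submitted for that host.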

Perhaps I misunderstand the situation - do you mean you are using subdomain roots instead of any directory/page/querystring structure at all?

force123

2:10 pm on Jun 24, 2009 (gmt 0)

10+ Year Member



I have this big post archive which is sorted by the topic's city. (Members have the option to choose the city.)

The system puts the city in the subdomain:
city1.example.com/post235/

There are more than 1,000 cities to choose from, so technically I have 1,000 subdomains.

tedster

6:53 pm on Jun 24, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



As far as I know, you will need 1,000 sitemaps for this structure. Are these archived posts not in Google's index currently?

force123

3:47 am on Jun 25, 2009 (gmt 0)

10+ Year Member



They were, but since June 2009 Google has dropped about 90% of them,
and my page visits have fallen from 35,000 per day to about 4,000 now.

tedster

4:54 am on Jun 25, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



So the issue isn't that Google doesn't know about these URLs - they do know, but decided to drop them. It doesn't sound to me like even 1,000 sitemaps will change that situation. You could experiment with sitemaps for one or two subdomains to see if it makes any difference, but my best guess says no.

You may need to re-think the URL structure (e.g. switch to subdirectories) to get enough juice flowing into these URLs that Google just might pick them up again. I'm sure that will be a lot more work. Depending on what the return is for you, it may or may not be worth the effort.

Just a thought - you say you have wildcard subdomains, right? Does that mean that even if you make up a subdomain (cityname) that doesn't exist your server will still serve something - maybe even a duplicate something? If so, you might want to back away from that configuration and only have specific, "real" subdomains resolve. Otherwise it looks like a subdomain spammer profile and that may be part of the problem.
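One way to sketch that restriction - a hypothetical check, since the real city list and server setup aren't shown in this thread - is to validate the request's Host header against the known cities and refuse everything else with a 404:

```python
# Sketch: only let "real" city subdomains resolve; a made-up subdomain
# should get a 404 instead of serving duplicate content.
# KNOWN_CITIES and the example.com domain are placeholder assumptions.
KNOWN_CITIES = {"city1", "city2", "city3"}

def subdomain_is_valid(host):
    """Return True for the bare/www domain or a known city subdomain."""
    host = host.lower().split(":")[0]          # strip any :port suffix
    if host in ("example.com", "www.example.com"):
        return True
    if host.endswith(".example.com"):
        sub = host[: -len(".example.com")]
        return sub in KNOWN_CITIES
    return False
```

The application (or a front-end rewrite rule doing the equivalent) would send a 404 whenever this check fails, so random invented subdomains stop resolving to duplicate pages.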

force123

1:36 pm on Jun 27, 2009 (gmt 0)

10+ Year Member



When I started to work on this site (the site is not mine), Google had sent three emails in its Webmaster Tools saying "You have an extremely large number of links". I was a newbie with Google at that time, and I didn't care about it. All that time the site kept improving its ranking while these emails were there (about a year). That's why I guess we never cared about our large amount of duplicate content (because the warning was there for about a year and we kept improving).

But now Google has dropped almost all of the subdomain URLs and still keeps those old ones.
I have to change all these "subdomain" URLs into something else,

like example.com/city1/post235/
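If the move to subdirectories happens, the old subdomain URLs should 301-redirect to their new locations so existing links keep their value. A minimal sketch of the mapping (the URL pattern is an assumption based on the examples in this thread; in practice this would likely live in a rewrite rule):

```python
import re

# Sketch: map an old subdomain URL to its new subdirectory form, e.g.
# http://city1.example.com/post235/ -> http://example.com/city1/post235/
# The hostname pattern is assumed from the examples in this thread.
OLD_URL = re.compile(r"^http://([a-z0-9-]+)\.example\.com(/.*)$")

def new_location(url):
    """Return the subdirectory URL an old subdomain URL should 301 to."""
    m = OLD_URL.match(url)
    if not m or m.group(1) == "www":
        return url                     # nothing to rewrite
    return "http://example.com/%s%s" % (m.group(1), m.group(2))
```

The server would answer requests for the old form with a 301 pointing at new_location(url), telling Google the pages have moved rather than disappeared.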

Btw, how much can a sitemap improve your ranking IF Google can index you by itself?

tedster

1:40 am on Jun 28, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



A sitemap cannot improve rankings for URLs that are already indexed via some other route. It can increase the speed with which changes get indexed, however.