
Is it OK to submit a sitemap with 200K+ URLs when only 19K pages are cached?

   
9:28 am on Oct 23, 2006 (gmt 0)

10+ Year Member



Hi,
On one of our sites we have added thousands of categories and products in the last 2 weeks, and because of that the number of pages on the website has increased from 19,000 to 200,000. I have generated a Google sitemap for those 200,000 URLs.

I am just a bit worried before submitting this sitemap. I need expert help to know whether Google might treat this sudden increase in the number of pages from 19,000 to 200,000 as spam. I have not done it for spamming; we have categorized our site in a more user-friendly manner, plus we have added categories for search terms, which have given us thousands of pages. We did all this for user friendliness, and it also helps us with internal backlinking.

Over the last two weeks I have seen in my log file that Googlebot activity has increased, and in Google Webmaster Tools I have seen that on average Googlebot is indexing 5,000-6,000 pages per day.

Please, can any of you give me advice on this?

Thanks
vikram

12:57 pm on Oct 23, 2006 (gmt 0)

10+ Year Member



Hi,
So may I take it that the experts here think there is no harm in uploading a sitemap with 200,000 URLs?

Please at least confirm.

thanks

1:29 pm on Oct 23, 2006 (gmt 0)

10+ Year Member



Vikram,

200k is a lot to put into a sitemap. Google allows webmasters to add multiple sitemaps for the same website, so maybe you should split the sitemap into different sections.

The Google sitemap is meant to help the spider find your pages. Make sure that these pages can also be reached through links from your homepage; otherwise Google may see them as search-result pages loaded into a sitemap.
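The splitting Donal describes can be sketched in a few lines of Python. This is a minimal sketch, assuming the Sitemap protocol's 50,000-URLs-per-file limit; the file names and example domain are placeholders, not anything from this thread:

```python
# Split a large URL list into sitemap files of at most 50,000 URLs
# each (the protocol's per-file limit), plus a sitemap index that
# ties them together. File names and the domain are assumptions.
from xml.sax.saxutils import escape

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS_PER_SITEMAP = 50000

def write_sitemaps(urls, base_url="http://www.example.com"):
    """Write sitemap-1.xml, sitemap-2.xml, ... and sitemap-index.xml.

    Returns the list of sitemap file names that were written.
    """
    names = []
    for i in range(0, len(urls), MAX_URLS_PER_SITEMAP):
        chunk = urls[i:i + MAX_URLS_PER_SITEMAP]
        name = "sitemap-%d.xml" % (len(names) + 1)
        with open(name, "w") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="%s">\n' % SITEMAP_NS)
            for url in chunk:
                # escape() guards against &, < and > in URLs
                f.write("  <url><loc>%s</loc></url>\n" % escape(url))
            f.write("</urlset>\n")
        names.append(name)
    # The index file is the single URL you submit to Google.
    with open("sitemap-index.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="%s">\n' % SITEMAP_NS)
        for name in names:
            f.write("  <sitemap><loc>%s/%s</loc></sitemap>\n" % (base_url, name))
        f.write("</sitemapindex>\n")
    return names
```

With 200,000 URLs this would produce four sitemap files plus the index; you then submit only the index file.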

Donal

1:42 pm on Oct 23, 2006 (gmt 0)

5+ Year Member



Maybe you can learn more by reading this: [googlewebmastercentral.blogspot.com...]

(Moderators: I hope it's okay for me to post a relevant link here?)

[edited by: OutdoorMan at 1:42 pm (utc) on Oct. 23, 2006]

3:21 pm on Oct 23, 2006 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Thanks for the link, OutdoorMan. It offers the definitive answer to this Google Sitemap question:

If each store includes more than 50,000 URLs (the maximum number for a single Sitemap), you would need to have multiple Sitemaps...
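For reference, the sitemap index file that ties multiple sitemaps together looks roughly like this under the Sitemap protocol; the domain, file names, and dates here are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-1.xml</loc>
    <lastmod>2006-10-23</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-2.xml</loc>
  </sitemap>
</sitemapindex>
```

You submit the index file's URL once, and the spider discovers the individual sitemaps from it.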
3:45 pm on Oct 23, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Maybe you should be thinking more about the impact of the sudden increase in the number of pages compared to what currently exists, and about your prior URL growth and change rates.

Adding 180,000+ pages to a site with 19,000+ is a large change.

11:12 am on Oct 24, 2006 (gmt 0)

10+ Year Member



Hi everybody,
It's great that all of you have tried to contribute, but I think most of you have misunderstood my question. My question is: is it OK to submit a sitemap with a sudden increase in URLs from 19,000 to 200,000? I have already created multiple sitemap.xml files, with a sitemap index file too.

My problem is the sudden increase in URLs. Along with this, I have noticed that Google is indexing all the new pages daily, but the site:mysite.com count varies each day; sometimes it shows most of the newly cached pages and sometimes not at all. Is this normal behaviour for Google, or is it just happening to me?

Please, can anyone reply to my query?

thanks
vikram

2:24 pm on Oct 24, 2006 (gmt 0)

5+ Year Member



If your site has very high trust, maybe. If not, then I would say no, based on the sites I have seen try this. Usually Google starts indexing the pages, then all of a sudden it just stops and you take a hit in rankings. Most sites have this happen; I've seen sites with high trust that have been up for a very long time get away with it, but it's rare.
4:48 pm on Oct 24, 2006 (gmt 0)

5+ Year Member



And how do you know if your site has a high trust rank, or not?
6:47 am on Oct 25, 2006 (gmt 0)

5+ Year Member



What about letting Google find your new pages on its own, without giving it a fright with a huge sitemap?
7:45 am on Oct 26, 2006 (gmt 0)

10+ Year Member



What about letting Google find your new pages on its own, without giving it a fright with a huge sitemap?

OK, great.
This seems to be the better option: let Google find the new 180,000 URLs on its own.

Please, if anybody does not agree with this, let us know.

thanks

[edited by: tedster at 5:47 pm (utc) on Oct. 29, 2006]
[edit reason] fix formatting [/edit]

 
