|SiteMaps + Large site =?|
Indexing large sites with sitemaps
| 9:04 pm on Sep 19, 2006 (gmt 0)|
I have several very large sites (200,000 to 2,000,000 pages of unique-ish content). On my largest one (the 2-million-pager) I've deliberately not linked to all the pages, out of fear that Google will consider the site too big and drop it. Is that a rational fear? And if so, could having a Google Sitemap for the whole thing be detrimental? I have a decent number of referrers I'd stand to lose here.
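For scale: the sitemap protocol caps each sitemap file at 50,000 URLs, so a 2M-page site means roughly 40 sitemap files behind one sitemap index. Below is a rough Python sketch of that split; urls.txt, the output file names, and the base URL are all placeholders, and you'd use whichever schema namespace your protocol version calls for.

# Split a large URL list into 50,000-URL sitemap files plus one index.
CHUNK = 50000  # protocol limit per sitemap file
BASE = "http://www.example.com/"  # placeholder

def esc(u):
    # minimal XML escaping for <loc> values
    return u.replace("&", "&amp;").replace("<", "&lt;").replace(">", "&gt;")

urls = [line.strip() for line in open("urls.txt") if line.strip()]
names = []
for i in range(0, len(urls), CHUNK):
    name = "sitemap-%03d.xml" % (i // CHUNK + 1)
    names.append(name)
    with open(name, "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for u in urls[i:i + CHUNK]:
            f.write("  <url><loc>%s</loc></url>\n" % esc(u))
        f.write("</urlset>\n")

# One index file that points at every chunk
with open("sitemap-index.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for name in names:
        f.write("  <sitemap><loc>%s%s</loc></sitemap>\n" % (BASE, name))
    f.write("</sitemapindex>\n")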
| 10:06 pm on Sep 19, 2006 (gmt 0)|
According to Matt Cutts, releasing more than 5,000 pages a week will raise a flag, and the new pages will likely not get indexed. Even if they do, at that rate it would take 400 weeks to release a 2M-page site.
My guess is that if the site is of exceptional value, Google might take notice and release it if you contact them, but you'd need to be something like Wikipedia to get away with it.
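As a quick sketch of that drip-feed arithmetic (Python; the 5,000/week figure is the one cited above, and the URL list is hypothetical):

# 2M pages at 5,000 pages/week
TOTAL = 2000000
PER_WEEK = 5000
weeks = -(-TOTAL // PER_WEEK)  # ceiling division -> 400
print(weeks, "weeks")

def weekly_batches(urls, per_week=PER_WEEK):
    # yield one release-sized slice of URLs per week
    for i in range(0, len(urls), per_week):
        yield urls[i:i + per_week]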
| 7:06 am on Oct 9, 2006 (gmt 0)|
Excellent link, thank you.
My site is two years old, though, and Matt Cutts is talking about launch time. What about a large amount of existing content that Google simply wouldn't have found before?
| 8:50 am on Oct 9, 2006 (gmt 0)|
I opened a members-only section to the general public and launched 16,000 'pages' in one week, as the spider finally got access to those as well. They all got indexed within a week.
As for the sitemap, I submitted the pages that carry all the content links, and that seems to have done the job pretty well. Mind you, these link pages are accessible to general users as well (roughly as sketched below).
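A minimal sketch of that hub-page approach (Python; the URL pattern and counts are made up, with 160 hub pages at ~100 content links apiece covering roughly 16,000 pages):

# List only the link/hub pages in the sitemap and let the
# spider follow them out to the individual content pages.
hubs = ["http://www.example.com/browse/%d.html" % i for i in range(1, 161)]

with open("sitemap-hubs.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for u in hubs:
        f.write("  <url><loc>%s</loc></url>\n" % u)
    f.write("</urlset>\n")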