One of our sites uses URLs of the following form:
domain.com/default.asp?pid=15&gid=264371&la=2
The parameter ranges are:
pid: 3 values (there are actually more pages, but only 3 are of importance for now)
gid: 5,000,000 values (worldwide regions)
la: 20 values (languages)
So we are talking about submitting:
3 × 5,000,000 × 20 = 300 million pages
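For scale, the sitemap protocol allows at most 50,000 URLs per sitemap file, so a URL space this size needs at least 6,000 sitemap files behind a single sitemap index. A minimal sketch of the enumeration, using the parameter ranges above (the counting logic is illustrative, not our actual generator):

import itertools

URLS_PER_SITEMAP = 50_000  # per-file limit in the sitemap protocol

pids = range(1, 4)              # 3 page types
gids = range(1, 5_000_001)      # 5 million regions
langs = range(1, 21)            # 20 languages

def urls():
    # Lazily enumerate every pid/gid/la combination as a URL.
    for pid, gid, la in itertools.product(pids, gids, langs):
        yield f"http://domain.com/default.asp?pid={pid}&gid={gid}&la={la}"

total = len(pids) * len(gids) * len(langs)
print(total)                          # 300,000,000 URLs
print(-(-total // URLS_PER_SITEMAP))  # 6,000 sitemap files at minimum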
All pages include absolutely original content (regional in nature), which is updated every 12 hours or so.
We want Google to index them all :)
Does anybody think this is possible?
Thanks
I have just finished writing the spec for a smart sitemap.xml update routine in which only the sitemaps that reference changed pages are rewritten, improving the chances of changed or new pages getting crawled.
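In outline it works something like this (a minimal sketch; the fixed id-to-file assignment and the two callbacks are illustrative assumptions, not the actual spec):

URLS_PER_SITEMAP = 50_000

def dirty_sitemaps(changed_page_ids):
    # Pages are assigned to sitemap files in fixed id order, so a page id
    # maps directly to the sitemap file that lists it.
    return {page_id // URLS_PER_SITEMAP for page_id in changed_page_ids}

def update_sitemaps(changed_page_ids, rebuild_file, bump_index_lastmod):
    # Rewrite only the sitemap files that reference changed pages, then
    # bump their <lastmod> entries in the sitemap index so crawlers can
    # skip the thousands of untouched files.
    for n in sorted(dirty_sitemaps(changed_page_ids)):
        rebuild_file(n)        # e.g. regenerate sitemap-00042.xml.gz
        bump_index_lastmod(n)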
As a matter of interest, how long does your sitemap.xml generation process take to run?
And that's the kicker. It's not the number of pages, it's the authority of the site that will increase the probability of more pages being indexed. I've got a dead site with 32,000 pages, but only 280 indexed :).
The only thing I would suggest is that you look into something like pingomatic. Perhaps that will help increase crawling.
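A ping is just a cheap HTTP request, so it costs little to try. As an example, a minimal sketch of pinging Google with a sitemap URL (the sitemap address is made up, and the endpoint is worth verifying before relying on it):

import urllib.parse
import urllib.request

SITEMAP = "http://domain.com/sitemap_index.xml"  # hypothetical sitemap URL

# Google's sitemap ping endpoint takes the sitemap URL as a query parameter.
ping = "http://www.google.com/ping?sitemap=" + urllib.parse.quote(SITEMAP, safe="")
with urllib.request.urlopen(ping) as resp:
    print(resp.status)  # 200 means the ping was received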
So we are talking about submitting:
3 × 5,000,000 × 20 = 300 million pages
Call me cynical, but just how is this possible:
All pages include absolutely original content (regional in nature), which is updated every 12 hours or so.
Meaning, with 300 million pages each changing twice a day:
600 million page updates per day, or
4,200 million per week, or
18,000 million per 30-day month, or
219,000 million per 365-day year.
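Put in crawl-rate terms (a back-of-the-envelope check, assuming every update must be re-fetched to stay current):

pages = 300_000_000
updates_per_day = pages * 2            # each page changes every ~12 hours
fetches_per_second = updates_per_day / 86_400
print(round(fetches_per_second))       # ~6,944 fetches per second, nonstop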
I suggest you read this:
You need your own search engine :-)