This is what I have done:
1. Created a script (.asp, .php, or similar) that generates output listing the 50,000 latest URLs (50,000 is the per-sitemap limit according to Google's Sitemaps FAQ).
2. My site has user-generated content where each item has an ID, so each line of the output is http://example.com/<<id>>.
3. This means Google should get the 50,000 latest URLs each time it fetches the sitemap.
4. Will Google give up if it takes too long to generate? It takes maybe 20-30 seconds when I try it.
5. Is this a good way to use Sitemaps? I think I've got it right, but what do you say?
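The steps above could be sketched roughly like this (shown in Python rather than ASP/PHP; the `items` table, the in-memory SQLite database, and the base URL are all assumptions for illustration, not details from the post):

```python
# Sketch: generate a list of the newest URLs for a sitemap.
# Assumptions: items live in a SQLite table "items" with an integer
# "id" column; a real site would query its own database instead.
import sqlite3

SITEMAP_LIMIT = 50000  # Google's per-sitemap URL limit
BASE_URL = "http://example.com/"  # hypothetical base URL

def latest_urls(conn, limit=SITEMAP_LIMIT):
    """Return the newest item URLs, newest first."""
    rows = conn.execute(
        "SELECT id FROM items ORDER BY id DESC LIMIT ?", (limit,)
    )
    return [BASE_URL + str(r[0]) for r in rows]

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY)")
    conn.executemany("INSERT INTO items (id) VALUES (?)",
                     [(i,) for i in range(1, 6)])
    # A plain-text file with one URL per line is a valid sitemap format
    # that Google accepts, alongside the XML format.
    print("\n".join(latest_urls(conn)))
```

Note that Google accepts a plain-text sitemap (one URL per line) as well as the XML format, so a simple list like this is usable as-is.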
Will this have any impact on my crawling? I already have links to all these pages (each page links to "previous" and "next", like a linked list), so they should eventually be found by Google anyway, I think.
Maybe the sitemap will speed that up, who knows?
What do you say?
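On point 4 above: one common way to avoid Googlebot waiting 20-30 seconds per fetch (an approach I'm suggesting, not something from the post) is to regenerate the sitemap on a schedule, e.g. from cron, and let the web server serve it as a static file. The file name and URL source below are hypothetical:

```python
# Sketch: pre-generate the sitemap to a static file so the web server
# can serve it instantly; run this periodically (e.g. from cron).
import os
import tempfile

def write_sitemap(urls, path="sitemap.txt"):
    """Write URLs atomically: build a temp file, then rename into place."""
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    with os.fdopen(fd, "w") as f:
        f.write("\n".join(urls) + "\n")
    # os.replace is atomic on POSIX, so a reader (or Googlebot fetching
    # the file mid-update) never sees a half-written sitemap.
    os.replace(tmp, path)

if __name__ == "__main__":
    write_sitemap(["http://example.com/1", "http://example.com/2"])
```

This way the 20-30 second generation cost is paid in the background, and the fetch itself is instant.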