Forum Moderators: Robert Charlton & goodroi
At the moment I have a link on all pages that points to a sitemap. The sitemap contains links to sitemaps 2-10 and to 90 of the latest pages. Sitemaps 2-10 contain 100 links each.
Does anyone know a good way to get this large number of pages indexed?
In my experience, pages with PR below 2 are unlikely to be indexed quickly and tend to drop to a URL-only listing. So you have to ensure all new pages get PR2 or higher. Try to estimate your chances of achieving this with the original PR formula - it's likely outdated now, but you'll get the general idea.
If each sitemap has 100 outgoing links, I guess you need at least PR4 to ensure the new pages will be indexed quickly. So you need all of sitemaps 2-10 at PR4-5 each, the main sitemap perhaps at PR5-6, and your main page at about PR6-7. But these are just my guesses, not calculations.
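To see why 100 outgoing links dilute things so much, the original PageRank formula can be sketched in a few lines. All the numbers below are illustrative assumptions (raw PR units, not toolbar values, and toolbar PR is roughly logarithmic anyway); `pagerank_share` is just a hypothetical helper name:

```python
# Sketch of the original PageRank formula:
#   PR(A) = (1 - d) + d * sum(PR(T) / C(T)) over pages T linking to A,
# where C(T) is the number of outgoing links on page T.

DAMPING = 0.85  # the d commonly used in the original formula

def pagerank_share(incoming):
    """incoming: list of (raw PR of linking page, outlink count of that page)."""
    return (1 - DAMPING) + DAMPING * sum(pr / outlinks for pr, outlinks in incoming)

# A hypothetical sitemap page with raw PR 100.0 and 100 outgoing links
# passes each new page only 100/100 = 1 unit before damping:
print(pagerank_share([(100.0, 100)]))  # 0.15 + 0.85 * 1.0 = 1.0

# Halving the links per sitemap doubles what each new page receives:
print(pagerank_share([(100.0, 50)]))   # 0.15 + 0.85 * 2.0 = 1.85
```

The point of the sketch is only that PR flowing to each new page scales inversely with the number of links on the sitemap, which is why fewer links per sitemap (or a higher-PR sitemap) helps.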
Inbound links from pages on other domains, pointing directly to the sitemaps or even directly to the new pages, would help a lot (assuming those pages have PR3 or higher; below that it's not worth bothering). But you have to get them some better way than spamming: spamming risks getting your whole site banned, and links from unrelated, poor-quality pages (which spam-vulnerable sites generally are) do your site no good.
Deeplinks from quality related sites would be perfect.
You should make use of Google Site Maps.
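For a sitemap split into multiple files like the one described above, Google Sitemaps accepts a sitemap index file that points at each of the individual sitemaps. A minimal sketch, using the current sitemaps.org namespace and example.com as a placeholder domain:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap_index.xml: one <sitemap> entry per sitemap file -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap2.xml</loc>
    <lastmod>2005-06-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap3.xml</loc>
  </sitemap>
  <!-- ...and so on for sitemaps 4-10 -->
</sitemapindex>
```

You submit the index file once and Google discovers the rest from it, so new sitemap files only require an edit to the index.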
I also recommend structuring your home page like a "news" site - Google is used to quickly picking up articles from news sites.
Make a logical structure so that both Google and users can find information.
Be patient :)
What about the RSS feeds that blogs have, like on Google's Blogger.com? Can I submit my RSS feed to those kinds of services?
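An RSS feed for a non-blog site is just an RSS 2.0 XML file listing the newest pages, so nothing stops you from publishing one for the "latest pages" already linked from the main sitemap. A minimal sketch with placeholder titles and URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Site: Latest Pages</title>
    <link>http://www.example.com/</link>
    <description>Newly added pages (placeholder content)</description>
    <item>
      <title>New page title</title>
      <link>http://www.example.com/new-page.html</link>
      <pubDate>Wed, 01 Jun 2005 12:00:00 GMT</pubDate>
    </item>
    <!-- one <item> per newly added page -->
  </channel>
</rss>
```

Whether any particular service will accept a feed from a non-blog site depends on that service, so check its submission rules first.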