Forum Moderators: Robert Charlton & goodroi


200 new pages / day, how can I get them indexed?


jozomannen

10:48 pm on Jan 13, 2006 (gmt 0)

10+ Year Member



I'm currently working with a site that gets about 200 new pages a day. All of them are real pages with the .html extension. But since the site is new and doesn't have many links, I think the search engines will have trouble indexing all of them.

At the moment I have a link on every page that points to a sitemap. The sitemap contains links to sitemaps 2-10 and to the 90 latest pages. Sitemaps 2-10 contain 100 links each.

Does anyone know any ways to get such a large number of pages indexed?
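In case it helps picture the setup, here's a rough Python sketch of the tiered structure I described. The function and file names are just made up for illustration, and I'm assuming the page list is sorted newest-first:

```python
# Hypothetical sketch of the tiered sitemap scheme described above:
# one main sitemap linking to sitemaps 2-10 plus the 90 newest pages,
# and nine child sitemaps holding 100 links each.

def build_sitemaps(urls, newest_count=90, per_child=100, children=9):
    """Return (main_links, child_pages) for a tiered HTML sitemap.

    urls is assumed to be sorted newest-first.
    """
    newest = urls[:newest_count]          # 90 latest pages on the main sitemap
    rest = urls[newest_count:]            # everything else goes to children
    child_pages = [
        rest[i * per_child:(i + 1) * per_child]
        for i in range(children)
    ]
    child_links = [f"sitemap{n}.html" for n in range(2, 2 + children)]
    main_links = child_links + newest     # 9 child links + 90 newest = 99 links
    return main_links, child_pages
```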

Small Website Guy

11:39 pm on Jan 13, 2006 (gmt 0)

10+ Year Member



I think that PageRank is an important part of what Google will index. Google seems to quickly index a page if it finds a new link to it from a PR5 or better page.

But for new links from low-PR pages, Google will take its sweet time before it follows them.

leunga

10:01 am on Jan 14, 2006 (gmt 0)

10+ Year Member



I agree with the advice from Small Website Guy. A link with good PR plays a role. Try getting some good-PR links not just for the homepage, but also for inner pages if you want them indexed quickly. For example, if your site has an A to Z index, then you'd better have links pointing to all 26 of those pages. This will accelerate the process as a whole. If you don't do that, deeper pages will need more PR update cycles to get the benefit (according to this theory).

Freedom

10:41 am on Jan 14, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



200 new pages a day? Sounds like a scraper or Potemkin village website.

What kind of pages are those? High quality original content pages written by experienced copywriters?

peewhy

10:52 am on Jan 14, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



'Potemkin village' - I like it!

It often takes me a day to write a handful of pages from creation to FTP, so 200 pages per day is some going. I wonder if Google questions such a mass of new pages over a short period?

Wizard

11:07 am on Jan 14, 2006 (gmt 0)

10+ Year Member



You did well with the sitemap, and those links to recently added pages will also help. But in your case, PR will be essential. It's not so important for ranking now, but it still has a lot to do with crawling frequency and the chances of a page being indexed quickly rather than URL-only.

From my experience, pages with PR below 2 are unlikely to be indexed quickly and tend to drop to a URL-only listing. So you have to ensure all new pages get PR2 or higher. Try to estimate your chances of achieving this with the original PR formula - it's likely outdated now, but you'll get the general idea.

If each sitemap has 100 outgoing links, I guess you need at least PR4 to ensure the new pages will be indexed quickly. So you need sitemaps 2-10 at PR4-5 each and the main sitemap perhaps at PR5-6, which means your main page must be about PR6-7. But these are just my guesses, not calculations.
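For reference, the original formula from the Brin/Page paper is PR(A) = (1 - d) + d * sum(PR(T)/C(T)) over the pages T linking to A, where C(T) is T's outlink count and d is usually taken as 0.85. A toy sketch of the single-inlink case (a new page linked only from one sitemap) - note these internal values are not on the same scale as toolbar PR, and the numbers are made up:

```python
# Toy illustration of the original PageRank formula for a page with a
# single inbound link: PR(A) = (1 - d) + d * PR(T) / C(T).
# Shows how PR gets diluted when a sitemap has many outgoing links.

def pagerank_single_inlink(parent_pr, parent_outlinks, d=0.85):
    """PR of a page whose only inbound link comes from one parent page."""
    return (1 - d) + d * (parent_pr / parent_outlinks)
```

With 100 outgoing links, a sitemap passes only 1/100th of its own PR to each new page, which is why the sitemap itself needs substantial PR.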

Having inbound links from pages on other domains directly to the sitemaps, and even directly to new pages, would help a lot (assuming those pages have PR3 or higher; with less it's not worth bothering). But you have to achieve this in a better way than spamming - spamming risks getting the whole site banned, and links from unrelated poor-quality pages (which spam-vulnerable sites generally are) are not good for your site.

Deeplinks from quality related sites would be perfect.

jozomannen

9:58 am on Jan 16, 2006 (gmt 0)

10+ Year Member



Thanks for the reply. I've thought about the idea of making RSS feeds so people can add them to their sites. The feeds would contain the newest pages I've added. That way, I would get many deep links. The site isn't very big yet and doesn't have many visitors, so I thought there may be some site/service where I could publish my RSS feed. How about blogger.com or some site like that? Could I sign up for some free blog accounts (where I get my own subdomain) and add my RSS feeds there?
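Something like this minimal RSS 2.0 skeleton is what I had in mind - just the channel plus one item per new page. The site name, titles, and URLs below are placeholders:

```python
# Minimal RSS 2.0 feed sketch for the "newest pages" idea above.
# Covers only the required channel elements; real feeds would add
# pubDate, guid, description per item, etc.

def make_rss(items, site="http://www.example.com"):
    """items: list of (title, relative_url) tuples for the newest pages."""
    entries = "".join(
        f"  <item><title>{title}</title><link>{site}/{url}</link></item>\n"
        for title, url in items
    )
    return (
        '<?xml version="1.0"?>\n'
        '<rss version="2.0"><channel>\n'
        f"  <title>Newest pages</title><link>{site}</link>\n"
        "  <description>Latest additions to the site</description>\n"
        f"{entries}"
        "</channel></rss>\n"
    )
```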

ScottD

11:25 am on Jan 16, 2006 (gmt 0)

10+ Year Member



If you are adding that much new content a day, and the content is your own, you will quickly get a high PR, especially if it is useful content. If the content is scraped, you will probably get blacklisted.

You should make use of Google Sitemaps.

I also recommend using a structure like a "news" site on your home page - Google is used to quickly picking up articles from news sites.

Make a logical structure for both Google and users to find information.

Be patient :)
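On the Google Sitemaps point: the file you submit is just an XML list of URLs. A rough sketch of generating one - the schema namespace has changed over time, and the URLs are placeholders:

```python
# Sketch of an XML sitemap file per the sitemap protocol's
# <urlset>/<url>/<loc> layout. A real generator would also emit
# <lastmod> and escape special characters in the URLs.

def make_sitemap_xml(urls):
    """Return an XML sitemap document listing the given absolute URLs."""
    body = "".join(f"  <url><loc>{u}</loc></url>\n" for u in urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{body}"
        "</urlset>\n"
    )
```

You could regenerate this daily with the 200 new URLs and resubmit it through the Google Sitemaps interface.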

jozomannen

5:24 pm on Jan 16, 2006 (gmt 0)

10+ Year Member



The site is growing pretty fast, the information is useful, and I have many returning visitors. So I'm sure it will get a good PR and get many pages indexed eventually. But I want it to happen faster :D

How about that thing with RSS feeds on blogs, like Google's blogger.com - can I add my RSS feed to that kind of service?

jozomannen

11:54 pm on Jan 17, 2006 (gmt 0)

10+ Year Member



What if I add the RSS feed to another site of mine, so that it shows the latest 10 entries from the feed? Would that help a lot, or is it just a waste of space on my other site?

bts111

12:29 pm on Jan 19, 2006 (gmt 0)

10+ Year Member



Gee whiz! You must have loads of content writers on your books.