

Getting G to crawl 300,000 pages?

It's been 4 yrs and only at 88,000

         

mrSEman

1:13 am on Dec 1, 2005 (gmt 0)

10+ Year Member



I have a site with over 300 thousand pages. Obviously it is db driven and allows visitors to add content to their respective country/state/city/widget/widget pages.
Anyway, the site has been up for over 4 years now and G has only indexed 88,000 pages, most of them as URL-only listings. The site is fully crawlable and more and more people are adding content, but growth is very slow when people cannot find the site in the search engines.

I have thought about a sitemap, but there are just too many pages; I estimate it would take a week to build one, without knowing whether it would actually work.

Any tips on getting G to speed up its crawling?

mrSEman

1:17 am on Dec 16, 2005 (gmt 0)

10+ Year Member



anyone?

BadSense

1:25 am on Dec 16, 2005 (gmt 0)

10+ Year Member



You sure Googlebot is making it through the site okay?

Stefan

1:29 am on Dec 16, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>> I have thought about a sitemap, but there are just too many pages

A 300,000-page sitemap, eh? Yep, that would be tricky.

Man, you have 88,000 pages listed. Are there really that many variations on your intended keywords and topics? We're talking Wikipedia territory here.

Anyway, good luck on getting the other couple of hundred thousand pages listed.

walkman

2:05 am on Dec 16, 2005 (gmt 0)



>> allows visitors to add content to their respective country/state/city/widget/widget

What % of pages actually have unique information added? I'm not talking about "Click here to see (city) keyword" type text. If the text is 99% similar, getting indexed will not help, because you will not rank.

Other than that, you can write a little script that displays a few links on each page; that way, every page has a few internal links pointing to it.
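
A minimal sketch of that cross-linking idea, assuming a simple numeric page ID and a /page/{id} URL pattern (both placeholders, not the poster's actual schema): pick a small, stable set of other pages to link from each page, so every URL ends up with a few internal links pointing at it.

# Sketch only: deterministically pick a handful of other pages to cross-link
# from each page, so every page in the database gets some internal links.
# The URL pattern and ID scheme are illustrative assumptions.
import hashlib

def related_page_ids(page_id: int, total_pages: int, links_per_page: int = 5) -> list[int]:
    """Pick a stable pseudo-random set of other page IDs to link to."""
    ids = []
    seed = hashlib.md5(str(page_id).encode()).hexdigest()
    for i in range(links_per_page):
        # Derive each target from the hash so the links don't change on every render.
        target = int(seed[i * 4:(i + 1) * 4], 16) % total_pages
        if target != page_id:
            ids.append(target)
    return ids

def render_related_links(page_id: int, total_pages: int) -> str:
    """Return an HTML snippet of cross-links to drop into the page template."""
    items = [
        f'<li><a href="/page/{pid}">Listing {pid}</a></li>'
        for pid in related_page_ids(page_id, total_pages)
    ]
    return "<ul>" + "".join(items) + "</ul>"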

Ledfish

5:35 am on Dec 16, 2005 (gmt 0)

10+ Year Member



If you have 300K pages of actually valuable and unique content... you should be able to hire a team of top SEO talent to solve this problem for you.

If you can't afford them, I would have to question the usefulness of the pages' content.

texasville

11:17 pm on Dec 16, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Have you tried running Xenu or another link checker to make sure that every link is navigable?
You might also try an automated sitemap generator; Google it.
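
For anyone without Xenu handy, a rough stand-in for the same check is sketched below: crawl same-host links breadth-first from the homepage and report URLs that fail to load. The start URL, the crawl limit, and the lack of robots.txt handling are all simplifications for illustration.

# Sketch of a minimal link check: walk the site and report URLs that error out,
# to confirm every page is reachable by a crawler.
from collections import deque
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen
from urllib.error import URLError, HTTPError
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def check_site(start_url: str, limit: int = 500) -> list[str]:
    """Breadth-first crawl of same-host pages; return URLs that failed to load."""
    host = urlparse(start_url).netloc
    seen, broken = {start_url}, []
    queue = deque([start_url])
    while queue and len(seen) < limit:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
        except (HTTPError, URLError):
            broken.append(url)
            continue
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            if urlparse(absolute).netloc == host and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return broken

# Example: print(check_site("http://www.example.com/"))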

molsen

11:23 pm on Dec 16, 2005 (gmt 0)

10+ Year Member



Try a Google Sitemap. You'll need to break the URLs into sitemaps of 50,000 URLs each and use a sitemap index to reference the smaller files. It helped a client's site with about 1.2m pages (database driven, unique content) get indexed up to about 900k. The total count is in constant flux, currently on the upswing.

Sitemaps URL: www.google.com/webmasters/sitemaps/
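
A minimal sketch of that split, assuming the site can already produce an iterable of its URLs from the database (the function name, file names, and example domain are placeholders): write files of at most 50,000 URLs each, then a sitemap index that points at them, which is the file you submit.

# Sketch only: chunk the URL list into sitemap files of 50,000 entries and
# tie them together with a sitemap index.
from itertools import islice

MAX_URLS_PER_SITEMAP = 50000  # limit in the Sitemaps protocol

def write_sitemaps(urls, base="http://www.example.com"):
    """Write sitemap-1.xml, sitemap-2.xml, ... plus sitemap-index.xml."""
    urls = iter(urls)
    sitemap_files = []
    n = 1
    while True:
        chunk = list(islice(urls, MAX_URLS_PER_SITEMAP))
        if not chunk:
            break
        name = f"sitemap-{n}.xml"
        with open(name, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in chunk:
                f.write(f"  <url><loc>{url}</loc></url>\n")
            f.write("</urlset>\n")
        sitemap_files.append(name)
        n += 1

    # The index file is what gets submitted; it points at the chunked sitemaps.
    with open("sitemap-index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for name in sitemap_files:
            f.write(f"  <sitemap><loc>{base}/{name}</loc></sitemap>\n")
        f.write("</sitemapindex>\n")

# Example: write_sitemaps(f"http://www.example.com/page/{i}" for i in range(300000))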

rj87uk

11:31 pm on Dec 16, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Links. Links. Links.

I would say you need to get a lot of links to each section of your website. Since you have that many "good" pages, you shouldn't have any problem getting one-way links. Many, many, many deep links.