
Limit of Pages

maximum pages


Albaba

7:53 am on Feb 6, 2003 (gmt 0)

10+ Year Member



Hi All,

I have a new site with about 150k pages. My question is: will Googlebot crawl all of these pages (150k)? What's your experience with huge pages?

Thanks for your attention.

PS: Googlebot is deep-crawling my site right now.

vitaplease

7:56 am on Feb 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Welcome to WebmasterWorld!

150,000 pages is a good chunk indeed.

Check some remarks made in this older thread:
[webmasterworld.com...]

Basically, the higher the PageRank and the more deep links pointing to inner pages, the more pages get indexed.

The type of links also plays a role: plain static links generally do better than dynamic ones.

Macguru

8:00 am on Feb 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hi Albaba,

>>what's your experience with huge pages?

If you are thinking of big page size: Google has a 100k limit per page (the code only, images not counted). vitaplease is right about the rest and provided a good link.

Welcome aboard!
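A quick way to sanity-check a page against that 100k limit is to measure the size of the raw HTML by itself. A minimal sketch (the sample page here is hypothetical; substitute your own page's HTML bytes):

```python
# Sketch: check whether a page's HTML (code only, no images) fits under
# the ~100 KB limit mentioned above.
LIMIT = 100 * 1024  # 100k in bytes


def under_limit(html: bytes, limit: int = LIMIT) -> bool:
    """Return True if the raw HTML is within the size limit."""
    return len(html) <= limit


# Hypothetical sample page of roughly 50 KB:
sample = b"<html><body>" + b"x" * 50_000 + b"</body></html>"
print(under_limit(sample))  # True
```

Anything over the limit risks having the tail end of the page ignored, so trim markup on your heaviest pages first.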

Albaba

8:10 am on Feb 6, 2003 (gmt 0)

10+ Year Member



Hi Vita, thanks for responding :)

My site is a three-level tree:

index -->> subindex -->> content

The deepest pages have URLs like this:

mysite.com/index.php?name=abc

What makes me happy is that some of the deep pages were already listed by Freshbot. Is that a good sign? :D

And is it possible that all my pages will be listed after the next dance?

Thanks

vitaplease

8:55 am on Feb 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Do a Google search for: allinurl: php

When you check the SERPs lower down, you will see that many such pages are indexed by Google.

>>is it possible that all my pages will be listed after the next dance?

I think that would be asking a bit too much.

You say your site is new, so its PageRank will most probably be very limited.

Get many high-quality links pointing to your site, and be patient for a couple of updates.

jomaxx

4:25 pm on Feb 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If the pages are all differentiated by a parameter after the "?" in the URL, then IMO Googlebot will take many cycles to index them and will probably never index them all. It prefers "static" pages.
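That preference for "static" pages was the usual motivation for URL rewriting: keep the same dynamic script, but serve it under static-looking URLs so crawlers never see the "?". A minimal Apache mod_rewrite sketch (hypothetical paths, and it assumes mod_rewrite is enabled on your server):

```apache
# .htaccess sketch: map a static-looking URL onto the real dynamic script.
# /page/abc.html  -->  /index.php?name=abc   (internal rewrite; the visible
# URL stays static)
RewriteEngine On
RewriteRule ^page/([A-Za-z0-9_-]+)\.html$ /index.php?name=$1 [L]
```

For this to help, your internal links must point at the /page/abc.html form, so Googlebot only ever encounters the static-looking URLs.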

bcc1234

4:42 pm on Feb 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



For a PR6 site, Google seems to go at least 8 links deep if the pages look static.

vitaplease

5:03 pm on Feb 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



150k pages.

Google has indexed about 3,000,000k pages (roughly 3 billion).

In other words: if Google indexed 20,000 sites like yours, that would be the entire index.

Do not expect too much too soon.