
"Results 171 - 174 of about 464" is the last page of Google's SERP for my site

leanweb

6:40 am on Jul 20, 2003 (gmt 0)

So Google shows that it has about 464 pages indexed on my site, but it won't let me page through all of them. I can get as far as 174 by hitting "Next". Why won't it let me go further?
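
Paging on Google works via a start= offset in the results URL, ten results per page. Below is a minimal Python sketch of the arithmetic behind walking "Next" to the end of a listing; count_on_page is a hypothetical helper standing in for fetching the page at a given offset and counting the result entries on it.

    # Walk the SERP ten results at a time and count what is actually
    # listed, as opposed to the "about N" estimate shown on page one.
    # count_on_page(start) is a hypothetical helper: real code would
    # fetch the results page at that offset and count its entries.

    def total_listed_results(count_on_page, page_size=10, max_pages=100):
        total = 0
        for page in range(max_pages):
            n = count_on_page(page * page_size)
            total += n
            if n < page_size:   # a short page means the listing has ended
                break
        return total

    # Fake counter mimicking this thread: the listing dries up at 174
    # even though the estimate says "about 464".
    fake = lambda start: max(0, min(10, 174 - start))
    print(total_listed_results(fake))   # prints 174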

Dolemite

6:48 am on Jul 20, 2003 (gmt 0)

Try adding &filter=0 to the end of the URL, if it's not already there... that might show you a few more. Chances are you already clicked the "repeat the search with the omitted results included" link, which does just that, though.
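
For reference, a short Python sketch of how that parameter attaches to the results URL. The parameter names q, start, and filter match what Google's result URLs showed at the time, but treat the exact set as an assumption and check your own address bar.

    # Build a Google results URL with the duplicate-results filter off.
    # Parameter names (q, start, filter) are assumed from the URLs Google
    # shows in the address bar; verify against your own before relying on them.
    from urllib.parse import urlencode

    def serp_url(query, start=0, filter_dupes=False):
        params = {
            "q": query,
            "start": start,                       # offset of the first result
            "filter": 1 if filter_dupes else 0,   # 0 = include omitted results
        }
        return "http://www.google.com/search?" + urlencode(params)

    print(serp_url("site:www.example.com", start=170))
    # http://www.google.com/search?q=site%3Awww.example.com&start=170&filter=0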

Flippi

9:36 am on Jul 20, 2003 (gmt 0)

Yes, set &filter=0. But even then you won't get all the pages you expect. The number shown isn't a count of pages found; it's an estimate of how many times the search expression matches.

rfgdxm1

2:34 pm on Jul 20, 2003 (gmt 0)

174 is "about 464" to Google. The "about" is just a guess.

leanweb

4:24 pm on Jul 20, 2003 (gmt 0)

Thanks for your comments!

I did add filter=0 in my original query.

As for "174 is about 464 to Google": that's not very precise, is it?

g1smd

5:31 pm on Jul 20, 2003 (gmt 0)

With 3 billion web pages indexed, I don't think they have time to count every single page on every single query. It's a figure in the right area.

I would have issues if it said 20,000 and then returned fewer than a dozen, but being 50% out isn't a lot.

rfgdxm1

5:49 pm on Jul 20, 2003 (gmt 0)

As g1smd wrote, my guess is that Google only cares to be ballpark accurate. Besides, how many people actually click through all the pages to the end? Few know, or care, how far off the initial guess is.

leanweb

1:34 am on Jul 21, 2003 (gmt 0)

Well, my good friends, allow me to disagree. If Google can't accurately count pages, how can we trust it to even find relevant pages?!

I suspect there is a better explanation here, because we all know Google to be an accurate SE.

Regardless, what I'd like to know is how many pages Google has indexed on my site, which is, you'll agree, an important metric. And in that respect, 150 and 450 are worlds apart.

Regards,
leanweb

IITian

1:54 am on Jul 21, 2003 (gmt 0)

Try:

site:www.yourdomainname.com -xhreirt

It will list all pages from www.yourdomainname.com that don't contain the string 'xhreirt', which in most cases means all the pages.
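
A quick Python sketch of how that exclusion query can be assembled and URL-encoded; the nonsense token is arbitrary, and anything unlikely to appear on a real page will do.

    # Build IITian's trick query: every indexed page on the site that does
    # NOT contain an arbitrary nonsense token -- i.e. nearly all of them.
    from urllib.parse import quote_plus

    def all_pages_url(host, nonsense="xhreirt"):
        query = "site:%s -%s" % (host, nonsense)
        return "http://www.google.com/search?q=" + quote_plus(query) + "&filter=0"

    print(all_pages_url("www.yourdomainname.com"))
    # http://www.google.com/search?q=site%3Awww.yourdomainname.com+-xhreirt&filter=0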

leanweb

2:36 am on Jul 21, 2003 (gmt 0)

IITian,

Thanks for the tip.

still "Results 171 - 174 of about 464"

may be i'm simply misinterpreting what google is doing and they in fact do have 464 pages?

oooh, when to expect the next dance?

olwen

3:41 am on Jul 21, 2003 (gmt 0)

I've been noticing this too. I get 1-6 of about 11 without filter=0, and 1-7 of about 11 with filter=0.

Net_Wizard

3:49 am on Jul 21, 2003 (gmt 0)



How about 350 out of 1170 in my case? :(

I agree that there's more to this than just Google guesstimating the number of pages for your site.

The higher number does closely correlate with the number of URLs Googlebot has visited. So for Google to display only 350 URLs is a bit of a mystery, unless of course the missing URLs have yet to go through some kind of process before being released to the public.

It's as if they have exactly that number of URLs crawled but still waiting to be incorporated into the public database. Maybe some filtering and recalculation of backlinks and PR.

Sometimes you'll see results such as:

yourdomain.com/deepfolder/exotic-widget.html
Similar pages

Note the missing title and snippet and/or cache link.

leanweb

4:48 am on Jul 21, 2003 (gmt 0)

I vaguely recall seeing somewhere a wireframe of Google's internal search algorithm, and (again, vaguely) that there was a two-stage process in which Google would scoop up only a portion of pages, then refine the search iteratively until it was happy with the relevance of the collection of links it had assembled. That may have something to do with these discrepancies.

Does anyone recall the document I'm referring to?

Bottom line: I'd like to know how completely my site has been spidered by Google, and whether I need to be doing something to get Google to spider the whole of it sooner.

Cheers,
leanweb