Forum Moderators: Robert Charlton & goodroi

Google indexed pages question

travisbickle

6:43 pm on Dec 29, 2005 (gmt 0)

10+ Year Member



I went to uptimebot and entered my site and found that Google had indexed only a couple hundred of my pages. The web site is a database of private schools and I have almost 30,000 pages on the site. The site has been up for almost 5 years. Can anyone tell me why the entire contents have not been indexed after so long?

tedster

6:59 pm on Dec 29, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There can be many reasons -- perhaps many of the pages Google already has appear to be too similar, for instance. Also, you mention that it's a database driven site. Are all your pages accessible through direct HTML links, or do many of them rely on queries to generate a page on the fly?

At any rate, rather than rely on a third party tool, I would suggest examining the results when you use the site: operator on Google itself. That may give you a better idea as to what is happening.

travisbickle

1:20 am on Dec 30, 2005 (gmt 0)

10+ Year Member



I did go to Google and got the same results.

tedster

5:38 am on Dec 30, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'd suggest looking at more than just the raw number of pages. Look through each of the specific listings that Google returns, and see if the pages are fully indexed or url-only. Also see whether the same "page" is returned more than once under a different url (with and without "www", or with a different query string). Notice if the page titles are unique or contain many duplicates.
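For example, you can compare the counts from a few variations of the site: operator (example.com standing in for your real domain -- a quick sketch, not an exhaustive list):

```
site:example.com                 total pages Google reports for the domain
site:www.example.com             pages indexed under the www hostname
site:example.com -inurl:www      pages indexed without www in the url
```

If the first count is noticeably larger than the second, the gap is often the same pages indexed twice under two hostnames.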

That kind of careful, page-by-page inspection may teach you something. I often find technical issues this way.

Also, I'd suggest looking at your backlinks (even internal) on the Yahoo Site Explorer [siteexplorer.search.yahoo.com] or any other search engine that gives more thorough results than Google's link: operator does. You may notice some oddities this way -- and if Slurp runs into something difficult, then in all likelihood Googlebot will too.

One more thing to check -- if you've got a robots.txt file, make sure it is saying exactly what you want it to say.
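As a quick illustration -- the paths below are made up, so check your own file against what you actually intend to block:

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /admin/
# Watch for overly broad rules: a single "Disallow: /" line
# would keep Googlebot out of the entire site.
```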

g1smd

3:44 pm on Dec 30, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Make sure that every page has a unique title and meta description, otherwise many pages will be filtered out of the results.
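One way to spot-check this at scale is a small script over saved copies of your pages. This is just a rough sketch -- the file names and sample HTML below are made up, and a real run would read your actual files:

```python
# Sketch of a duplicate-title check across a set of saved pages.
# The page contents here are invented examples.
import re
from collections import Counter

def extract_title(html):
    """Pull the <title> text out of a page, or None if there isn't one."""
    match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    return match.group(1).strip() if match else None

pages = {
    "school-001.html": "<html><head><title>Private Schools</title></head></html>",
    "school-002.html": "<html><head><title>Private Schools</title></head></html>",
    "school-003.html": "<html><head><title>Oakwood Academy - Private Schools</title></head></html>",
}

counts = Counter(extract_title(html) for html in pages.values())
for title, n in counts.items():
    if n > 1:
        print("Duplicate title on %d pages: %s" % (n, title))
```

The same Counter approach works for meta descriptions -- any title or description shared by dozens of pages is a candidate for being filtered.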

Run Xenu's Link Sleuth over your site and make sure that there are no problems with the navigation. Make sure, too, that all non-www URLs are redirected to www and that the status is "301".
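On an Apache server with mod_rewrite enabled, the non-www redirect usually looks something like this in .htaccess (a sketch -- substitute your own domain):

```apache
# Redirect non-www requests to www with a permanent (301) status.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

You can verify the status code by fetching a non-www URL with a header-checking tool and confirming the response is "301 Moved Permanently", not "302".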

travisbickle

7:31 pm on Dec 30, 2005 (gmt 0)

10+ Year Member



I went to Google direct as suggested and put in:

site:example.com uniquekeyword

and 36,700 results came up instead of 440.

[edited by: tedster at 7:48 pm (utc) on Dec. 30, 2005]
[edit reason] no specifics please [/edit]

tedster

7:53 pm on Dec 30, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It sounds like most of those pages may be "Supplemental Results".

travisbickle

1:03 pm on Dec 31, 2005 (gmt 0)

10+ Year Member



Tedster...it may be that Google considers the pages so similar that during a regular search only 440 come up, even though 36,700 are actually indexed.