Forum Moderators: Robert Charlton & goodroi


How to get Google to deep crawl a new website?

and index deep pages.

         

rj87uk

10:58 am on Mar 1, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hey Guys,

The website I have is around four months old, and the front page ranks really well.

It has many internal pages that are high in quality.

I have been getting quality links to the internal pages.

However, the problem is that Google is not putting these pages in its index; only some random pages are.

It has many quality links to the home page.

The pages are unique & good for users.

So does anyone have any tips to help Google index more deep pages?

RJ

phranque

12:20 pm on Mar 1, 2007 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



are you using google webmaster tools and the sitemap protocol?

rj87uk

12:51 pm on Mar 1, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



webmaster tools and the sitemap protocol

I decided I didn't need or want to use it; I feel the website has a very good structure, as most/all of the pages can be found within two clicks from the home page.

Links are static simple html links.
Good internal links from the content linking into other pages.

It's like only my home page is "trusted". Is trust rank assigned per page rather than per site?

Green_Grass

12:56 pm on Mar 1, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I got a deep crawl only when I verified my site with Google in the Webmaster Tools. Now all pages are in the index.

Just tell G about the site by using the verify feature. It should help and is easy.

Site map is secondary.

Hope this helps.
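For anyone unfamiliar with the verify feature mentioned above: at the time, Google offered two methods, uploading an empty HTML file with a Google-supplied name, or adding a meta tag to the home page. A sketch of the meta-tag method (the `content` token below is a made-up placeholder; Google generates your actual one):

```html
<!-- Goes in the <head> of the home page.
     The content value is a hypothetical placeholder;
     Google Webmaster Tools supplies the real token. -->
<meta name="verify-v1" content="EXAMPLE-TOKEN-NOT-REAL" />
```

Once the tag is in place, you click "Verify" in Webmaster Tools and Google fetches the page to confirm ownership.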

rj87uk

1:07 pm on Mar 1, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Just tell G about the site by using the verify feature.

Should I need to? If it has quality links to the home page and deep links, Google should have everything it needs to find all the pages.

I think it has something to do with the age of the website rather than a technical issue. How old was your website, Green_Grass, when you used the tools, and how long before pages were added to the index (which index: supplemental or main)?

I am also interested: did your website have links, deep links, quality content, etc.?

Green_Grass

1:16 pm on Mar 1, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Well... it was 4 months old when I verified it, and all pages got indexed very quickly. Only two pages went supplemental (on analysis, maybe they deserved it ;-)).

No, this site did not have many high-quality inbound links.

amanS

1:22 pm on Mar 1, 2007 (gmt 0)

10+ Year Member



I launched a new site at the beginning of February. This site is part of a bigger site, but under a subdomain (subdomain.example.com); the top domain is PR8.

The new site contains 3000+ files but since February 7th, only 500 files show up in the search results.

We verified in Webmaster Tools and also submitted XML sitemaps with 3000+ links. The sitemap file is crawled multiple times daily.

The site is spidered a few times a week but the spider doesn’t go very deep.

I suspect the parts of the site getting spidered (as opposed to those listed in the XML sitemap) are what is showing up on Google. The problem is that only about 500 of the 3000+ files show up.
I don't think the XML sitemap is helping much, for now.
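For reference, the sitemap protocol being discussed is a plain XML file listing the URLs you want crawled, one `<url>` entry per page. A minimal sketch (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page; only <loc> is required -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-03-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/deep-page.html</loc>
  </url>
</urlset>
```

You then submit the file's URL in Webmaster Tools. Note that a sitemap tells Google the pages exist; it does not guarantee they will be indexed.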

rj87uk

1:34 pm on Mar 1, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



<added>

Took the advice, signed up, and will post my findings here if anyone is interested.

Also, this is the first time I have heard about this operator:
info:www.example.com

</added>

phranque

1:36 pm on Mar 1, 2007 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



i submitted a sitemap for a 1000 page site in which nearly all pages are essentially inaccessible except through javascript (clicking on map icons, for example) and the entire site was indexed in a day or two...

rj87uk

1:46 pm on Mar 1, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The rate at which Googlebot crawls is based on many factors. At this time, crawl rate is not a factor in your site's crawl. If it becomes a factor, the Faster option below will become available.

"Number of pages crawled per day: 13: Max"

I wonder, then, how this would change; anyone want to weigh in?

------

Never mind this post it changed:

"Number of pages crawled per day: 325: Max"