Will Using the Google Search Box Get Every Page Crawled?

For Large Sites


Go60Guy

2:13 pm on Apr 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I read on another board that using the Google site search box - [services.google.com...] - assures that all your pages will be crawled by Google.

Just wondering if there's any substance to this.

jpjones

2:23 pm on Apr 1, 2003 (gmt 0)

10+ Year Member



Following their sign-up process, they say "we have X number of pages in our index".

I personally doubt it will force them to spider ALL the content of a given site, as it doesn't make a lot of sense. Google having a publicly available back door to spider anything and everything? It doesn't wash with me.

Anyone care to prove me wrong?

JP

creative craig

2:28 pm on Apr 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Nope, you're right, the pages have to be in the Google index for them to show in the Google search box for any site.

You still have to have the usual SEO'd site to make it in there, not just the search box ;)

Craig

Go60Guy

2:32 pm on Apr 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yeah, I would agree that you would have to have pages already indexed. But, can using the site search box at least assure that all currently indexed pages will be crawled?

hetzeld

2:36 pm on Apr 1, 2003 (gmt 0)

10+ Year Member



Go60Guy,

When you apply for the free search, it is clearly stated that Google doesn't guarantee spidering of your pages, not even a single one...

Dan

creative craig

2:37 pm on Apr 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



No because the search box just uses the Google index the same way that you do when you hit the search button on the home page!

jpjones

2:38 pm on Apr 1, 2003 (gmt 0)

10+ Year Member



In my mind, search and crawl are two different processes.

Pages will be crawled by googlebot according to what it finds on other pages and the algorithm it follows.

These crawled pages will then be indexed into the database ready for returning as search results.

The user searches, and the matching results are taken from the current index.

So long as you don't do anything dodgy on your site with regards to SEO, there should be no reason that googlebot will stop crawling all the pages it currently has in its index for your site. It might even pick up new ones from a natural crawl. There is no way to force it to pick up the rest of your site, other than making it usable by following standard practices.

JP
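
Roughly, the separation JP describes can be sketched like this (a simplified illustration only, not Google's actual implementation; the function names and the toy data structures are made up for the example):

```python
# Toy sketch of the crawl -> index -> search pipeline, to show that
# searching reads the existing index and never triggers a crawl.

def crawl(seed_urls, link_graph):
    """Discover pages by following links from pages already known."""
    discovered, queue = set(), list(seed_urls)
    while queue:
        url = queue.pop(0)
        if url in discovered:
            continue
        discovered.add(url)
        # New URLs are only found via links on pages already crawled.
        queue.extend(link_graph.get(url, []))
    return discovered

def build_index(pages):
    """Build an inverted index: word -> set of pages containing it."""
    index = {}
    for url, text in pages.items():
        for word in text.lower().split():
            index.setdefault(word, set()).add(url)
    return index

def search(index, query):
    """Answer a query from the index alone -- crawl() is never called."""
    return index.get(query.lower(), set())
```

A site search box only ever calls the equivalent of `search()`, so pages absent from the index simply never appear, and running queries does nothing to get them crawled.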