Just wondering if there's any substance to this.
I personally doubt it will force them to spider ALL the content of a given site, as that doesn't make a lot of sense. Google offering a publicly available back door to spider anything and everything? It doesn't wash with me.
Anyone care to prove me wrong?
JP
Pages are crawled by googlebot according to the links it finds on other pages and the algorithm it follows.
These crawled pages are then indexed into the database, ready to be returned as search results.
The user searches, and the matching results are taken from the current index.
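To make that concrete, here is a rough Python sketch of that crawl/index/search loop. The in-memory web dict and the crawl/search functions are purely illustrative assumptions, nothing Google actually runs:

from collections import deque

# Toy stand-in for the web: URL -> (page text, outgoing links).
web = {
    "/": ("home page about widgets", ["/about", "/products"]),
    "/about": ("about our widget company", ["/"]),
    "/products": ("widget catalogue", ["/products/blue"]),
    "/products/blue": ("blue widget details", []),
    "/orphan": ("never linked, never crawled", []),
}

def crawl(start):
    """Breadth-first crawl: only pages reachable by links get visited."""
    seen, queue, index = set(), deque([start]), {}
    while queue:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        text, links = web[url]
        for word in text.split():
            index.setdefault(word, set()).add(url)  # word -> URLs containing it
        queue.extend(links)
    return index

def search(index, term):
    """Results come from the index built at crawl time, not the live site."""
    return sorted(index.get(term, set()))

index = crawl("/")
print(search(index, "widget"))  # ['/about', '/products', '/products/blue']
print(search(index, "orphan"))  # [] - the unlinked page never got indexed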
So long as you don't do anything dodgy on your site with regards to SEO, there should be no reason for googlebot to stop crawling the pages it currently has in its index for your site, and it may well pick up new ones from a natural crawl. There is no way to force it to pick up the rest of your site, other than making the site usable by following standard practices, i.e. making sure every page is reachable through ordinary links. A quick self-check along those lines is sketched below.
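The same link-reachability idea gives a rough orphan-page check. Assuming you can list every URL on your server (the links mapping below is a made-up example), anything not reachable from the home page will never turn up in a natural crawl:

from collections import deque

# Hypothetical site map: each URL -> the URLs it links to.
links = {
    "/": ["/about", "/products"],
    "/about": ["/"],
    "/products": [],
    "/old-promo": [],  # exists on the server, but nothing links to it
}

def reachable(start):
    """URLs a link-following crawler can ever reach from `start`."""
    seen, queue = set(), deque([start])
    while queue:
        url = queue.popleft()
        if url not in seen:
            seen.add(url)
            queue.extend(links.get(url, []))
    return seen

print(set(links) - reachable("/"))  # {'/old-promo'}

Link to anything that turns up in that set from a page googlebot already crawls, and it should get picked up naturally.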
JP