
Googlebot/2.1 skipping pages on a crawl

         

javawookie

12:33 pm on Aug 26, 2003 (gmt 0)

10+ Year Member



Until about 6 weeks ago, Google was indexing all the pages on my site on a daily basis. It is still indexing the site, except for one directory of 6 pages which is 3 levels deep from the root.

These pages are linked from my site map and from a page which is visited daily.

Won't Google crawl several folders deep?

Has anyone else had this problem, or can anyone think of ways I can rectify it?
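Before anything else, it may be worth confirming from the server access log whether Googlebot is requesting those pages at all. A minimal sketch, assuming an Apache-style log; the log filename, the `/a/b/c/` directory path, and the sample lines are placeholders for illustration, not details from this thread:

```shell
# Create two sample Apache-style log lines (placeholders), then count
# how many Googlebot requests hit the directory in question.
printf '%s\n' \
  '66.249.64.1 - - [26/Aug/2003] "GET /a/b/c/page1.html HTTP/1.0" 200 "Googlebot/2.1"' \
  '10.0.0.5 - - [26/Aug/2003] "GET /index.html HTTP/1.0" 200 "Mozilla/4.0"' \
  > access.log

# Filter to Googlebot requests, then count those inside /a/b/c/.
grep 'Googlebot' access.log | grep -c '/a/b/c/'   # prints 1

rm access.log
```

If the count for the directory is zero over several weeks, Googlebot is genuinely skipping it rather than crawling it without indexing.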

Thanks

Java

Mark_A

2:41 pm on Aug 27, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



So simmonsj_98, what exactly is it requesting? The root / with "search"?

And there really is no pattern in the requesting IP addresses to suggest someone hacking for something from perhaps 4 different IPs?

If not, then perhaps it's evidence of some kind of distributed infection? Interested to hear what others have to say ... hope it's limited to Windows servers :-)

takagi

2:57 pm on Aug 27, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Not sure the reply by Mark_A will help you, so I will give it a try.

Unless you change the whole site at least once a day, I would guess there is no need for all pages to be crawled daily. So if these directory pages are rather stable, it is enough if they are spidered once a month by Google. For most sites, the majority of the sub-pages are spidered once a month on average, so there is nothing to worry about.

Mark_A

3:03 pm on Aug 27, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Oops .. yes takagi, you are quite right ... my post was meant for another thread .... oh dear ... too many windows open at the same time :-( Sorry javawookie, ignore my message ..