Forum Moderators: open


Any suggestions or ideas why Gbot will not deep crawl

Lots of suggestions but no success


derekwit

12:49 am on Jan 6, 2003 (gmt 0)

10+ Year Member



I have read many posts, and it seems that everybody looks to you for answers; I am hoping you can find the time to assist me. I have two different threads going now, sorry about that. One is about Gbot deep crawling one of my pointer domains: [webmasterworld.com...]

The other is about Gbot only crawling my index and robots.txt pages. I have validated all pages, checked with my hosting company, and checked and rechecked the code. I am at a loss! Based on Gbot's past visits to my site, he usually deep crawls me twice. [webmasterworld.com...]

This month he came by yesterday, and he should revisit around the 8th-11th. I would greatly appreciate any suggestions you may have. Thanks in advance for any help you can give!

nutsandbolts

1:01 am on Jan 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



My first suggestion is to really ramp up the links to your site; I can only see a small number of low-ranked sites linking to you. Mmm, has your site been in Dmoz?

Marcia

1:19 am on Jan 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



We really don't review individual sites (check the charter), and there's also this post about direct appeals to GoogleGuy [webmasterworld.com].

There can be any number of reasons, some of which were mentioned in those other threads, and others that can be dug out by reading through the forum.

With over 3 billion pages, Google just doesn't deep crawl everything every time, which is the simplest explanation. There can be any number of reasons why some sites are deep crawled and some aren't. It can depend on PR, linking, robots.txt and lots of things, not the least of which is dynamic pages - depending somewhat on the number of parameters.
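To illustrate the dynamic-pages point above: the conventional wisdom at the time was that URLs carrying many query-string parameters were crawled less readily than static-looking ones. The URLs below are made up for illustration; this is just a quick way to see how parameter-heavy a given URL is.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical example URLs -- the more query parameters a dynamic URL
# carries, the less readily it was believed to be crawled.
urls = [
    "http://example.com/page.cfm",
    "http://example.com/page.cfm?id=7",
    "http://example.com/page.cfm?id=7&cat=3&sort=asc&session=abc123",
]

for u in urls:
    # parse_qs splits the query string into a dict of parameter -> values
    n = len(parse_qs(urlparse(u).query))
    print(u, "->", n, "parameters")
```

Session IDs in particular (like the `session=` parameter above) were a common culprit, since they make every visit look like a different URL.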

Basically, it takes digging deep into each issue that's a possible cause, correcting what's found, and then giving it time. Google just doesn't do it all every month.

derekwit

1:42 am on Jan 6, 2003 (gmt 0)

10+ Year Member



Yes, my site is in Dmoz. I also have about 30 backlinks, of which 3 are PR6 and 2 are PR5. I have also dug extremely deep through numerous threads, only to conclude that there seems to be something I am missing. I have researched the fact that .cfm files are listed in Google, quite a few of them. I have also researched meta tags, robots.txt files, and how Gbot visits based on previous encounters.

After doing some serious log studies, I have determined that Gbot spidered my entire site on 10/08/02. Great, right? Wrong. I made the serious mistake of having multiple domains pointing to my main domain. I know, dumb mistake; sometimes, for me at least, that is the only way to learn. Now how do I fix the problem? Based on the IP I found for the deep crawl, it shows up over the last 2 months on 2 different occasions, one on or around the 4th and the other on or around the 8th. This means I have only 3 more days this month.

I have taken the advice from previous threads. I called my hosting company and checked on server-side errors. No dice! Working fine. I checked on those 404 errors, and it seems that particular site gives the same error for every page I look up. Any ideas would be great. Thanks for everything that has been done so far.
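The kind of log study described above can be sketched with a short script: pick out hits from a given crawler in a common-log-format access log and count them per day to spot the visit pattern. The log lines and user-agent string below are made up for illustration; a real study would read the hosting company's actual log file and should verify the IP, since user agents can be faked.

```python
import re
from collections import Counter

# Fabricated sample log lines in common log format, for illustration only.
LOG_LINES = [
    '66.249.1.1 - - [04/Jan/2003:02:11:09 +0000] "GET /index.cfm HTTP/1.0" 200 512 "-" "Googlebot/2.1"',
    '66.249.1.1 - - [04/Jan/2003:02:11:10 +0000] "GET /robots.txt HTTP/1.0" 200 64 "-" "Googlebot/2.1"',
    '10.0.0.7 - - [05/Jan/2003:14:03:22 +0000] "GET /index.cfm HTTP/1.0" 200 512 "-" "Mozilla/4.0"',
]

# Matches the date portion of the bracketed timestamp, e.g. [04/Jan/2003
DATE_RE = re.compile(r'\[(\d{2}/\w{3}/\d{4})')

def crawl_days(lines, agent="Googlebot"):
    """Return a Counter mapping date -> number of hits by the given agent."""
    days = Counter()
    for line in lines:
        if agent in line:
            m = DATE_RE.search(line)
            if m:
                days[m.group(1)] += 1
    return days

print(crawl_days(LOG_LINES))
```

Run against a couple of months of real logs, a script like this makes the "on or around the 4th and the 8th" pattern jump out immediately.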

onlineleben

9:55 am on Jan 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Don't know if it helps:
-- some of your page names have spaces in them
-- the phrase "reciprocal links" on your links page could imply that you are running a kind of link farm
-- your directory structure is quite deep; concentrating pages nearer to the root could improve spidering
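On the first point: a space is not valid in a URL and has to be percent-encoded, so a page name with spaces can end up linked in both encoded and unencoded forms, which a crawler may treat as different (or broken) URLs. A quick illustration, using hypothetical page names:

```python
from urllib.parse import quote

# Hypothetical page names with spaces -- in a URL the space must be
# percent-encoded as %20, which is why such names cause trouble.
pages = ["product list.cfm", "about us.cfm", "contact.cfm"]

for name in pages:
    print(name, "->", quote(name))
```

The safer fix is simply to rename the files to use hyphens or underscores instead of spaces, so the link text and the URL are always identical.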