Forum Moderators: Robert Charlton & goodroi
I've been looking at some of my site logs recently and wondering whether there's anything I can glean from them other than "oh, Googlebot's picked up my new pages" or "Wow, Googlebot's requesting a lot of pages today". Is it possible to infer anything from the frequency or order in which G accesses the pages? Repeated requests for a single page? Does the request order tell me anything about my link structure? I presume G works on some sort of logical basis, hence my query.
Apologies in advance if this is a noob subject...!
All comments welcome.
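One easy first step is just counting how often Googlebot hits you per day, which makes frequency trends obvious. Here's a rough sketch in Python, assuming your server writes Apache "combined" log format; the sample log lines below are invented for illustration, and real logs would be read from your access log file instead:

```python
import re
from collections import Counter

# Invented sample lines in Apache "combined" log format, just for illustration.
SAMPLE_LOG = """\
66.249.66.1 - - [12/Mar/2006:06:01:02 +0000] "GET /robots.txt HTTP/1.0" 200 120 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
66.249.66.1 - - [12/Mar/2006:06:01:05 +0000] "GET / HTTP/1.0" 200 5120 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
10.0.0.7 - - [12/Mar/2006:07:15:00 +0000] "GET /index.html HTTP/1.1" 200 5120 "http://example.com/" "Mozilla/5.0"
66.249.66.1 - - [13/Mar/2006:05:59:40 +0000] "GET /new-page.html HTTP/1.0" 200 2048 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
"""

# Captures: (1) the date, (2) the method, (3) the path, (4) the status, (5) the user-agent.
LINE_RE = re.compile(
    r'\[(\d{2}/\w{3}/\d{4}):[^\]]+\] "(\w+) ([^ "]+)[^"]*" (\d{3}) \S+ "[^"]*" "([^"]*)"'
)

def googlebot_hits_per_day(log_text):
    """Count requests per day whose user-agent string mentions Googlebot."""
    counts = Counter()
    for line in log_text.splitlines():
        m = LINE_RE.search(line)
        if m and "Googlebot" in m.group(5):
            counts[m.group(1)] += 1
    return counts

print(googlebot_hits_per_day(SAMPLE_LOG))
```

Obviously the regex only covers the common case; if your host uses a custom log format you'd adjust it. Going by user-agent alone can also be spoofed, so treat the counts as indicative rather than exact.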
Gbot comes daily and requests robots.txt and the root page - at some busy sites he comes twice a day.
At least once every week or so he crawls deep through my sites visiting a lot of pages.
During those visits he constantly tries to fetch pages that don't exist:
GET /sortdir/asc HTTP/1.0
GET /eText/ HTTP/1.0
GET /eCat2/ HTTP/1.0
... always similar requests, even though they get 404ed.
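If you want to pull those phantom requests out of your logs automatically, a short script works. This is a sketch assuming Apache combined log format; the log lines below are made up to mirror the examples above, and in practice you'd feed in your real access log:

```python
import re

# Made-up sample lines in Apache combined log format.
LOG = """\
66.249.66.1 - - [12/Mar/2006:06:02:00 +0000] "GET /sortdir/asc HTTP/1.0" 404 290 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
66.249.66.1 - - [12/Mar/2006:06:02:03 +0000] "GET /eText/ HTTP/1.0" 404 290 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
66.249.66.1 - - [12/Mar/2006:06:02:05 +0000] "GET /index.html HTTP/1.0" 200 5120 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
"""

# Captures: (1) the path, (2) the status code, (3) the user-agent.
REQ_RE = re.compile(r'"GET ([^ "]+)[^"]*" (\d{3}) \S+ "[^"]*" "([^"]*)"')

def googlebot_404s(log_text):
    """Return the paths Googlebot requested that came back 404."""
    return [m.group(1)
            for m in map(REQ_RE.search, log_text.splitlines())
            if m and m.group(2) == "404" and "Googlebot" in m.group(3)]

print(googlebot_404s(LOG))
```

Running that over a week of logs shows you quickly whether the bogus URLs follow a pattern (e.g. old links elsewhere, or mangled relative links on your own pages).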
New pages are best fed to him via deep links from the home page (better than from a site map).
It takes at most 3 days for new pages to appear in the index after being crawled.