Currently I'm very crudely parsing the logfiles for a rough guesstimate, and basically want to know (in an efficient, organised manner):
How often the Googlebot visits pages (frequency)
How long it spends on pages (duration)
What pages/urls it visits
*All on a daily/weekly basis
After seeing several posts about Googlebot either not returning for X period of time, or returning error Y in Webmaster Tools, is there an ideal existing solution for tracking access to my site(s)?
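For illustration, a minimal sketch of that kind of crude daily count, assuming an Apache/nginx "combined" format access log; the access.log path and the regex are assumptions, so adjust them for your server:

```python
# Minimal sketch: count daily Googlebot hits and the URLs it requested,
# assuming Apache/nginx "combined" log lines. The access.log path and the
# regex are assumptions -- adjust for your own server's log format.
import re
from collections import Counter, defaultdict

LINE_RE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[(?P<day>[^:]+)[^\]]*\] '
    r'"(?P<method>\S+) (?P<url>\S+)[^"]*" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

hits_per_day = Counter()                 # frequency: Googlebot hits per day
urls_per_day = defaultdict(Counter)      # which pages it fetched each day

with open("access.log") as fh:
    for line in fh:
        m = LINE_RE.match(line)
        if not m or "Googlebot" not in m.group("agent"):
            continue
        day = m.group("day")             # e.g. "12/Mar/2009"
        hits_per_day[day] += 1
        urls_per_day[day][m.group("url")] += 1

# output is keyed by the raw date string, one line per day
for day, count in sorted(hits_per_day.items()):
    print(day, count, urls_per_day[day].most_common(5))
```

Note that "duration" doesn't exist in the log itself for a crawler; if you want it, you'd approximate it from the gaps between consecutive Googlebot requests in the same visit.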
Most good log analyzers can track search engine spiders. The trick is to identify them properly. Try searching for something akin to "spider identification" to find a list of common bots.
So parsing the log files is the right method, but using a log analyzer to do it is much better than doing it crudely. I know that five years ago or more I was able to change the settings in WebTrends log analyzer's .ini files to dramatically improve its spider identification, but that product has changed a huge amount since then.
There are a fair few decent free log analyzers out there. Try to find one that lets you play with a text file that identifies spiders.
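A rough sketch of that "editable spider list" idea, assuming a hypothetical spiders.txt with one user-agent substring per line (e.g. Googlebot, Slurp, msnbot):

```python
# Rough sketch of the editable spider list: a plain-text file (hypothetical
# name spiders.txt, one user-agent substring per line) that the log parser
# checks each hit against to tag it as a known spider or a normal visitor.
def load_spider_patterns(path="spiders.txt"):
    with open(path) as fh:
        return [line.strip().lower() for line in fh if line.strip()]

def identify_spider(user_agent, patterns):
    ua = user_agent.lower()
    for pattern in patterns:
        if pattern in ua:
            return pattern       # name of the matching spider entry
    return None                  # not in the list: treat as a normal visitor

# e.g. identify_spider("Mozilla/5.0 (compatible; Googlebot/2.1; ...)", patterns)
# returns "googlebot"
```

Keeping the bot list in a plain file like this means you can add newly spotted user agents without touching the parser itself.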