Forum Moderators: DixonJones


How to track the Googlebot

Analysing Googlebot access to a site


nick279

7:42 am on Sep 22, 2008 (gmt 0)




After trying several packages recently, I just wanted to see if anyone is using a proven solution for tracking the Googlebot.

Currently I'm very crudely parsing the logfiles for a rough guesstimate, and basically want to know (in an efficient, organised manner):

How often the Googlebot visits pages (frequency)
How long it spends on pages (duration)
What pages/urls it visits

*All on a daily/weekly basis

After seeing several posts on the GBot either not returning for X period of time, or returning error Y in Webmaster Tools, is there an ideal existing solution for tracking access to my site(s)?

Receptional

12:07 pm on Sep 26, 2008 (gmt 0)



This is one of those times when raw log files are far more useful than the now more common way of measuring visitors: via JavaScript.

Most good log analyzers can track search engine spiders. The trick is to properly identify them. Try something akin to searching for "spider identification" to find a list of common bots.

So parsing the log files is the right method... but using a log analyzer to do it is much better than "crude". I know that 5 years ago or more I was able to change the settings of Webtrends log analyzer's .ini files to dramatically improve its spider identification... but that product has changed a huge amount since then.

There are a fair few decent free log analyzers out there. Try to find one that allows you to play with a text file that identifies spiders.
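If you want to roll your own in the meantime, a minimal sketch of the user-agent approach might look like this. It assumes the common Apache/Nginx "combined" log format and matches any agent string containing "Googlebot"; the sample lines and counting scheme are just illustrative, not a definitive spider-identification list.

```python
# Minimal sketch: count Googlebot hits per day and per URL from a
# combined-format access log. Assumes GET/HEAD requests and the
# standard combined log layout; adjust the regex for other formats.
import re
from collections import Counter

# IP ident user [date:time zone] "METHOD url PROTO" status size "ref" "agent"
LINE_RE = re.compile(
    r'^\S+ \S+ \S+ \[(?P<date>[^:]+):[^\]]+\] '
    r'"(?:GET|HEAD) (?P<url>\S+)[^"]*" \d+ \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_hits(lines):
    """Return (per_url, per_day) Counters for lines with a Googlebot agent."""
    per_url, per_day = Counter(), Counter()
    for line in lines:
        m = LINE_RE.match(line)
        if m and "Googlebot" in m.group("agent"):
            per_url[(m.group("date"), m.group("url"))] += 1
            per_day[m.group("date")] += 1
    return per_url, per_day

# Illustrative sample lines (made-up traffic):
sample = [
    '66.249.66.1 - - [22/Sep/2008:07:42:00 +0000] "GET /index.html HTTP/1.1" '
    '200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '10.0.0.1 - - [22/Sep/2008:07:43:00 +0000] "GET /index.html HTTP/1.1" '
    '200 512 "-" "Mozilla/5.0 (Windows NT)"',
]
per_url, per_day = googlebot_hits(sample)
```

Gaps between successive Googlebot timestamps for the same URL give you crawl frequency; "duration on page" isn't really knowable from logs, since a bot just fetches the file and leaves.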