Forum Moderators: DixonJones
Is your site dynamic (PHP/ASP)?
If so, or if you can add server-side scripts to your pages, then you could self-tag your site with a short script that registers every page request. Server-side tags are not dependent on client-side JavaScript to execute (which, of course, is why the bots don't show up in JavaScript-based trackers: they don't execute client-side JavaScript).
The script would simply record its own log data; you would then use analytics software to analyze those logs.
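As a rough illustration of the idea (a minimal sketch in Python, not PHP/ASP; the file name `requests.log` and the field names are hypothetical), the "tag" just appends one record per request, and the "analytics" step tallies those records:

```python
import json
import time
from collections import Counter
from pathlib import Path

LOG_FILE = Path("requests.log")  # hypothetical log location

def record_request(path, user_agent):
    """The server-side 'tag': append one page request as a JSON line."""
    entry = {"ts": time.time(), "path": path, "ua": user_agent}
    with LOG_FILE.open("a") as f:
        f.write(json.dumps(entry) + "\n")

def count_requests_by_page():
    """Minimal 'analytics': tally logged requests per page."""
    counts = Counter()
    with LOG_FILE.open() as f:
        for line in f:
            counts[json.loads(line)["path"]] += 1
    return counts
```

Because the record is written on the server, every request lands in the log, whether it comes from a browser or a bot; recording the user agent lets the analysis step separate the two later.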
Hope this puts you on the right path,
Larry
Log in to that account, if you have one, and under the 'Diagnostics' tab you will see a new link called 'Crawl Stats'.
This will give you the last 90 days of Googlebot activity on your site, including reports for Number of Pages Crawled per Day, Number of Kilobytes Downloaded per Day, and Average Time Spent Downloading a Page.
You can read about it here... [kichus.in]