Home / Forums Index / WebmasterWorld / Website Analytics - Tracking and Logging
Forum Library, Charter, Moderators: Receptional & mademetop

Website Analytics - Tracking and Logging Forum

How to determine a deep crawl

 3:06 pm on Jan 27, 2005 (gmt 0)

I can tell from my logs how many hits I received from Googlebot, but I can't tell which pages, or how many, Googlebot actually crawled. How can I find this?



 11:45 pm on Jan 27, 2005 (gmt 0)

I would think any basic web log analyzer can break this information out of your log files for you. I won't bother listing names; you can look up these log analysis/reporting tools on your favorite search engine.
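If you'd rather not install an analyzer, a few lines of script can do the same filtering. This is a minimal sketch, assuming your server writes the common "combined" log format (client IP, request line, status, referrer, user agent); the sample lines and the regex are illustrative and would need adjusting to your actual log layout:

```python
import re
from collections import Counter

# Hypothetical sample lines in Apache combined log format (an assumption;
# adapt the parsing to whatever format your server actually writes).
LOG_LINES = [
    '66.249.64.1 - - [27/Jan/2005:03:00:00 +0000] "GET /index.html HTTP/1.1" 200 512 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    '66.249.64.1 - - [27/Jan/2005:03:00:05 +0000] "GET /about.html HTTP/1.1" 200 256 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    '10.0.0.5 - - [27/Jan/2005:03:00:07 +0000] "GET /index.html HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]

# Pull the requested path out of the quoted request line.
REQUEST_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+) HTTP/[\d.]+"')

def googlebot_hits(lines):
    """Count requests per path for lines whose user agent mentions Googlebot."""
    hits = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue  # skip browsers and other bots
        m = REQUEST_RE.search(line)
        if m:
            hits[m.group(1)] += 1
    return hits

print(googlebot_hits(LOG_LINES))
```

Running this over a real log file (one `Counter` update per line) gives you a per-page hit count for Googlebot only, which is exactly what the reporting tools produce.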


 3:24 am on Jan 28, 2005 (gmt 0)

If you don't have, or don't want, a log analysis program that can run a report on just the Googlebot user agent (or whatever), you can do it with Excel or Access: pull the Googlebot hits into a new file and sort by page. To eliminate duplicates in Excel (I don't know about Access), do a Subtotals ... Count operation.

But at any rate, if you know the number of hits, you more or less know the number of pages, because Googlebot won't be hitting images or CSS files. Its requests for robots.txt will pad the total, though.


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved