
Forum Moderators: DixonJones & mademetop


How to determine a deep crawl



3:06 pm on Jan 27, 2005 (gmt 0)

10+ Year Member

I can tell from my logs how many hits I received from Googlebot, but I can't tell what pages, or how many pages, Googlebot crawled. How can I find this?


9:26 pm on Jan 27, 2005 (gmt 0)

10+ Year Member



11:45 pm on Jan 27, 2005 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month

I would think any basic web log analyzer can break this information out of your log files for you. I won't bother to list any names; you can look up these log analysis/reporting tools on your favorite search engine.


3:24 am on Jan 28, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

If you don't have (or don't want) a log analysis program that will let you run a report on just the Googlebot user agent (or whatever else), you can do it with Excel or Access: pull the Googlebot hits out into a new file and sort by page. To eliminate duplicates in Excel (I don't know about Access), do a Subtotals ... Count operation.

But at any rate, if you know the number of hits, you more or less know the number of pages, because Googlebot won't be hitting images or CSS files. However, requests for robots.txt will pad the total.
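If you'd rather script it than use Excel, the same idea works in a few lines of Python: filter the log to Googlebot hits, pull out the requested URL, skip robots.txt, and count distinct pages. This is just a sketch; the sample log lines and the regex assume the common Apache combined log format, so adjust to match your own server's log layout.

```python
import re
from collections import Counter

# Hypothetical sample log lines in Apache combined format (paths invented).
LOG_LINES = [
    '66.249.66.1 - - [27/Jan/2005:03:06:00 +0000] "GET /index.html HTTP/1.1" 200 1234 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    '66.249.66.1 - - [27/Jan/2005:03:06:05 +0000] "GET /robots.txt HTTP/1.1" 200 56 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    '66.249.66.1 - - [27/Jan/2005:03:07:00 +0000] "GET /index.html HTTP/1.1" 200 1234 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    '10.0.0.5 - - [27/Jan/2005:03:08:00 +0000] "GET /about.html HTTP/1.1" 200 999 "-" "Mozilla/5.0"',
]

# Captures the request path from the quoted request portion of the log line.
REQUEST_RE = re.compile(r'"(?:GET|HEAD) (\S+) HTTP')

def googlebot_pages(lines):
    """Count hits per URL for requests whose user agent mentions Googlebot,
    skipping robots.txt so it doesn't pad the page total."""
    counts = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue
        m = REQUEST_RE.search(line)
        if not m or m.group(1) == "/robots.txt":
            continue
        counts[m.group(1)] += 1
    return counts

pages = googlebot_pages(LOG_LINES)
print(len(pages))           # distinct pages Googlebot crawled -> 1
print(sum(pages.values()))  # total Googlebot page hits -> 2
```

On the sample above, `/index.html` was fetched twice, the robots.txt hit is discarded, and the non-Googlebot hit to `/about.html` is ignored, which matches the point made earlier: hits roughly equal pages once you discount robots.txt.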

