Any basic web log analyzer should be able to break this information out of your log files for you. I won't bother listing names; a search on your favorite search engine for web log analyzer/reporting tools will turn up plenty.
If you don't have (or don't want) a log analysis program that can run a report on just the Googlebot User-Agent (or whatever bot you're tracking), you can do it with Excel or Access: pull the Googlebot hits out into a new file and sort by page. To eliminate duplicates in Excel (I don't know about Access), use the Subtotals... Count operation.
At any rate, if you know the number of hits, you more or less know the number of pages, because Googlebot won't be requesting images or CSS files or the like. Keep in mind, though, that its requests for robots.txt will pad the total.
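If you'd rather skip the spreadsheet step entirely, the same filter-and-dedupe idea takes only a few lines of Python. This is a minimal sketch, assuming your server writes the common Apache "combined" log format; the regex, the `Googlebot` substring match, and the list of static-file extensions to skip are all assumptions you'd adjust for your own setup:

```python
import re
from collections import Counter

# Assumes the Apache "combined" log format:
# host ident user [time] "METHOD path PROTO" status bytes "referer" "agent"
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'\d+ \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_pages(log_lines):
    """Count hits per unique path for requests whose User-Agent mentions
    Googlebot, skipping robots.txt and common static assets so the hit
    count tracks actual pages."""
    skip_suffixes = (".css", ".js", ".gif", ".jpg", ".png", ".ico")
    counts = Counter()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if not m or "Googlebot" not in m.group("agent"):
            continue  # not a parseable Googlebot request
        path = m.group("path")
        if path == "/robots.txt" or path.lower().endswith(skip_suffixes):
            continue  # don't let robots.txt or assets pad the total
        counts[path] += 1
    return counts
```

Feed it the lines of your access log (e.g. `googlebot_pages(open("access.log"))`); the length of the returned Counter is the number of distinct pages Googlebot touched, and the values give hits per page.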