Forum Moderators: open
They are simple solutions that don't require downloading and learning Analog, WebTrends, etc., or learning how to use grep!
But to answer your question: since I am on Linux, I use grep, e.g., grep Googlebot access.log.29 > gb to "copy" all the Googlebot records into the file gb. But you should also be able to import the log files into Excel or use some other logfile evaluation tool, e.g., webalizer or http-analyze. You can then sort/group by the user-agent column and search for Googlebot.
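To show the idea end to end, here's a small sketch. The log lines and the filename access.log.sample are made up for illustration (the post's real file was access.log.29), and I'm assuming the common Combined Log Format:

```shell
# Create a tiny sample access log (hypothetical lines, Combined Log Format)
cat > access.log.sample <<'EOF'
66.249.66.1 - - [01/Jan/2005:10:00:00 +0000] "GET / HTTP/1.1" 200 1234 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
192.0.2.5 - - [01/Jan/2005:10:01:00 +0000] "GET /page HTTP/1.1" 200 5678 "http://example.com/" "Mozilla/4.0 (compatible; MSIE 6.0)"
EOF

# "Copy" all Googlebot records into the file gb
grep Googlebot access.log.sample > gb

# gb now holds only the Googlebot lines; count them
wc -l < gb
```

The same grep works on rotated logs too, e.g. grep Googlebot access.log.* if you want to search all of them at once.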
I prefer a mix of both: the graphical tools to give me an overview, and the GNU tools to dig in deep and dirty (e.g., to see all the referrers; check what people were searching for; see who caused 404s; see what browsers people (as opposed to bots) were really using). BTW: don't trust any browser (user agent) statistic unless they show you the raw log files. At least 50% of Opera users cloak their user agent, many bots identify themselves as Netscape or IE, ...
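For the "dig in deep" part, here are a few one-liners of the kind I mean. Again the sample lines are hypothetical, and the field positions assume Combined Log Format (referrer is the 2nd quoted field, user agent the 3rd; status code is field 9 when splitting on whitespace) — adjust to your server's LogFormat:

```shell
# Sample log (hypothetical lines, Combined Log Format)
cat > access.log.sample <<'EOF'
192.0.2.5 - - [01/Jan/2005:10:01:00 +0000] "GET /gone HTTP/1.1" 404 209 "http://example.com/" "Mozilla/4.0 (compatible; MSIE 6.0)"
66.249.66.1 - - [01/Jan/2005:10:00:00 +0000] "GET / HTTP/1.1" 200 1234 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
EOF

# All referrers, most common first (2nd quoted field)
awk -F'"' '{print $4}' access.log.sample | sort | uniq -c | sort -rn

# Who caused 404s, and which URLs (status is field 9, IP field 1, URL field 7)
awk '$9 == 404 {print $1, $7}' access.log.sample

# User-agent breakdown (3rd quoted field)
awk -F'"' '{print $6}' access.log.sample | sort | uniq -c | sort -rn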