Googlebot does read the pages, so there's no need to worry about activating gzip — but it fetches the uncompressed version.
Mod_gzip serves the uncompressed page whenever the visiting client does not send an 'Accept-Encoding: gzip' header, and Googlebot apparently doesn't: my own logfiles show it fetching the uncompressed version (31,057 bytes versus 4,856 bytes compressed).
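The negotiation mod_gzip does can be sketched in a few lines — this is an illustrative Python stand-in (function name and page body are made up), not mod_gzip's actual code: the server only gzips the response when the request advertises gzip support.

```python
import gzip

def negotiate_body(accept_encoding, body):
    """Mimic mod_gzip's decision: compress only if the client
    advertises gzip support via an Accept-Encoding header."""
    if "gzip" in (accept_encoding or "").lower():
        return gzip.compress(body), "gzip"
    return body, "identity"

# A dummy page, just to show the size difference.
page = b"<html>" + b"lorem ipsum " * 2500 + b"</html>"

# A browser sending 'Accept-Encoding: gzip, deflate' gets the gzipped page.
compressed, enc = negotiate_body("gzip, deflate", page)

# A client that omits the header (as Googlebot apparently did at the time)
# gets the full uncompressed page back.
plain, enc2 = negotiate_body(None, page)
```

The compressed variant comes out a small fraction of the original size, which matches the ratio seen in the logs above.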
On a side note: inktomisearch (Yahoo's Slurp bot) does request the compressed pages.
GG often talked about this in the past and said he couldn't see a reason not to do it. Now that we know Y! does it, I wonder why G, with tens of thousands of servers, can't do the same — or whether there's a reason against it.