Haven't done it (hmmm, you've got me thinking, though), but I can answer some of your concerns:
> I suppose I am after the pitfalls, for example if a browser does not support the unzip process.
The browser tells the server which compression formats it can handle via the `Accept-Encoding` request header (not the `Accept` header, which is for media types). The server can look at that header and send either the gzipped or the plain-text version. For example, a simple mod_rewrite rule could do this: check `Accept-Encoding` and serve the static HTML page to browsers that don't accept gzip; for those that do, serve the pre-compressed version if it exists, or call the gzip routine to compress it if not.
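A minimal sketch of that rule, assuming Apache with mod_rewrite and mod_headers enabled, and assuming pre-compressed `.gz` copies sit next to the original files (the file layout is my assumption, adjust to your own setup):

```apache
RewriteEngine On

# Only rewrite when the client advertises gzip support...
RewriteCond %{HTTP:Accept-Encoding} gzip
# ...and a pre-compressed copy of the requested file exists on disk.
RewriteCond %{REQUEST_FILENAME}.gz -f
# Internally serve page.html.gz in place of page.html.
RewriteRule ^(.*)\.html$ $1.html.gz [L]

# Label the .gz variant as gzipped HTML, not as a gzip download.
<FilesMatch "\.html\.gz$">
    ForceType text/html
    Header set Content-Encoding gzip
</FilesMatch>
```

The on-the-fly fallback for pages that have no `.gz` copy yet would be handled by mod_deflate or a script; more on that below.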
> Should I zip on the fly or store cached versions of the page which have been gzipped. Or something else?
It would be a good idea to cache the gzipped files if you are hosted on a busy server. If you compress on the fly instead, you still save bandwidth, but at the cost of recompressing the same pages over and over. On a busy server, waiting for enough CPU time to do the compression could even take as long as sending the uncompressed file (not likely, but possible).
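For the on-the-fly route, assuming Apache 2.x, mod_deflate compresses responses as they are served, with no cached copies to manage (on Apache 1.3, mod_gzip plays the same role):

```apache
# Compress only text responses on the fly; any type not listed here
# is sent uncompressed.
AddOutputFilterByType DEFLATE text/html text/plain text/css
```

Every request pays the compression cost again, which is exactly the CPU-for-bandwidth trade described above.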
A mixed approach may be best: gzip and cache the most popular, largest pages; gzip on the fly (without caching) the moderately popular, medium-sized pages; and leave the rarely accessed, small pages alone. Reviewing your stats on bandwidth and CPU time consumed will show you where to draw the lines.
While it might seem easier to just gzip everything, already-compressed formats like JPEG images and MP3 audio gain no benefit at all from being gzipped and just cost you wasted CPU time, so some discrimination will be necessary. One way to exclude them is shown below.
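If you do let mod_deflate compress everything by default, the already-compressed formats can be excluded with an environment variable; this is essentially the example from the Apache mod_deflate docs, with a few extra extensions added:

```apache
SetOutputFilter DEFLATE
# Already-compressed formats: skip gzip and skip the Vary header.
SetEnvIfNoCase Request_URI \.(?:gif|jpe?g|png|mp3|zip|gz)$ no-gzip dont-vary
```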
Another issue is that you will need to modify your server's response headers to include `Vary: Accept-Encoding`. This tells proxy and browser caches to store the plain and gzipped versions of a page separately, keyed on the request's Accept-Encoding header rather than the URL alone.
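With mod_headers, for the pre-compressed setup sketched earlier, that is a one-liner (mod_deflate adds the header for you automatically):

```apache
# Without this, a proxy could cache the gzipped body and hand it to a
# client that never asked for gzip.
Header append Vary Accept-Encoding
```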