Forum Moderators: open
Firstly, let me say that like most people who join this board, I'm a long-time visitor, first-time poster.
I was just reading somewhere about using mod_gzip to reduce bandwidth. I also read that not every browser supports it, so would this in any way affect Googlebot? Has anyone had any negative experiences with using mod_gzip to compress data from the server?
Thank you!
mod_gzip only compresses a response when the requesting client advertises support for it in its Accept-Encoding request header. Therefore older browsers and spiders are not affected by compression; they just receive the normal page, which is theoretically slower than the compressed version.
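mod_gzip's behaviour — serving compressed output only to clients that advertise support, and the plain page to everyone else — can be sketched roughly like this (a toy illustration in Python, not Apache's actual code; the `respond` function and its signature are made up for the example):

```python
import gzip

def respond(body: bytes, accept_encoding: str) -> tuple[bytes, dict]:
    """Toy sketch of mod_gzip-style negotiation: compress the response
    only when the client's Accept-Encoding header mentions gzip."""
    if "gzip" in accept_encoding.lower():
        return gzip.compress(body), {"Content-Encoding": "gzip"}
    # Older browsers and spiders simply get the uncompressed page.
    return body, {}

# A repetitive page (like a large table) compresses very well.
page = b"<html>" + b"<tr><td>row</td></tr>" * 500 + b"</html>"
gz_body, gz_headers = respond(page, "gzip, deflate")
plain_body, plain_headers = respond(page, "")
```

Running this, `gz_body` comes out far smaller than `page` because of the repeated rows, while a client sending no Accept-Encoding header gets the identical uncompressed bytes back.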
In my experience, compression works very well on fairly large pages (40-100 KB) that contain repetitive markup, e.g. a page with a large table of many repeated rows.
The difference is well worth it for users on modems, and still noticeable over broadband.
Also, on pages with a very small file size (1-5 KB), the overhead in server CPU usage nullifies the advantage of compression, to the point where our servers (admittedly under heavy usage) would fall over at a far lower load than when serving uncompressed data.
<added>
OK, it's easy to check. Plenty of people have 'environment variable' scripts in the Google cache.
Googlebot sends "application/x-gzip" in its HTTP "Accept" header, but no "Accept-Encoding" header.
I'm not sure what that means. Does it mean that Googlebot will accept a document of type gzip, but that it doesn't want gzip'd HTML documents? I'm confused. At least mod_gzip can just default to uncompressed, as rpking points out.
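The distinction being puzzled over here is that "Accept" lists the content *types* a client will take (so "application/x-gzip" just means it can handle a .gz file as a download), while "Accept-Encoding" is what asks for gzip *transfer* of an ordinary HTML page. A tiny sketch of that check (the function name and the Googlebot header values are illustrative, taken from this thread rather than verified):

```python
def wants_gzip_encoding(headers: dict) -> bool:
    """Only an Accept-Encoding header mentioning gzip asks for a
    gzip-compressed transfer of the page itself; 'application/x-gzip'
    in the Accept header does not."""
    return "gzip" in headers.get("Accept-Encoding", "").lower()

# Headers as reported for Googlebot in this thread (unverified):
googlebot_like = {"Accept": "text/html, application/x-gzip"}
browser_like = {"Accept-Encoding": "gzip, deflate"}
```

Under this reading, a client sending only the Accept header above would get the plain page, which is exactly mod_gzip's safe default.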
</added>
[edited by: ciml at 5:57 pm (utc) on Aug. 6, 2002]