Has anyone had experience with HTTP compression?
Did it make a big difference to download times?
Were there any browser compatibility issues?
Is it easy to install on a Windows server?
Are the server overheads dramatic?
Any advice would be greatly appreciated.
I know Andy King [google.com] goes into it in his book, mentioning that it does give some significant benefits.
I've not used it, but would love to give it a try and learn how it works...
I have been reading about it all morning, and my biggest worry is that no one has anything bad to say about it at all. Even the best technology has its detractors, but HTTP compression seems to be universally lauded (by those who have used it).
I found a tool that lets you plug in a website address; it reports back the page size and the size compression would shrink it to, and I must say it is pretty bloody impressive:
My homepage would be compressed from 13,366 bytes to 3,344 bytes.
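If you want to sanity-check numbers like that yourself, below is a rough Python sketch of what such a tool does; the URL is a placeholder, and it simply fetches a page and compares its raw size against its gzip-compressed size:

import gzip
import urllib.request

url = "http://www.example.com/"  # placeholder; substitute the page you want to test
html = urllib.request.urlopen(url).read()

compressed = gzip.compress(html)
print(f"Uncompressed: {len(html):,} bytes")
print(f"Compressed:   {len(compressed):,} bytes")
print(f"Savings:      {100 * (1 - len(compressed) / len(html)):.0f}%")

(The real tool presumably also accounts for HTTP headers and the server's compression level, so treat this as a ballpark figure.)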
This obviously means reduced bandwidth and quicker delivery of the site. However, why is it that such a small fraction of websites use it (apparently only 41 of the Fortune 500 companies' websites)? Is it ignorance, or is there something I'm missing?
Based on what I have read it would be insanity not to use it.
If I can get a quick answer from my hosting people I'm going to give this a try over the next few days.
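While you wait on them, you can check for yourself whether a host already serves gzip. A minimal Python sketch (again with a placeholder URL): send an Accept-Encoding: gzip request header and see whether Content-Encoding: gzip comes back in the response.

import urllib.request

req = urllib.request.Request(
    "http://www.example.com/",  # placeholder; use your own site here
    headers={"Accept-Encoding": "gzip"},
)
with urllib.request.urlopen(req) as resp:
    encoding = resp.headers.get("Content-Encoding", "none")
    print("Content-Encoding:", encoding)  # "gzip" means compression is on

If it prints "gzip", your host already has compression enabled for that page.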
As I understand it (and my understanding is vague, so someone please correct me if I have this wrong), there is a potential downside to using compression, and that downside will be inconsequential for most pages but important for a few (usually badly built) pages.
The downside is this: the server buffers the output and saves it all up to send in one go. On a relatively small page, that's great. On certain table-based layouts that can't render until most of the page has loaded anyway, that's still great.
On a small number of huge text-based pages (giant long tables or huge chunks of text), you *may* want the data sent piecemeal so that the user doesn't have to wait for the whole thing to be buffered, sent, decompressed, and rendered.
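To make that concrete, here is a small Python sketch of the two approaches, assuming a zlib-style compressor on the server side (the page content is made up):

import zlib

# A long page generated in pieces, e.g. rows of a giant table.
page_chunks = [b"<p>chunk %d of a very long page</p>" % i for i in range(1000)]

# Buffered: compress the whole page in one go; nothing is sent until it's done.
buffered = zlib.compress(b"".join(page_chunks))
print(f"Buffered:  {len(buffered):,} bytes, sent only when complete")

# Piecemeal: flush compressed data after each chunk so the browser
# can start decompressing and rendering before the page is finished.
comp = zlib.compressobj()
sent = 0
for chunk in page_chunks:
    sent += len(comp.compress(chunk))
    sent += len(comp.flush(zlib.Z_SYNC_FLUSH))  # push this chunk out now
sent += len(comp.flush())  # finish the stream
print(f"Piecemeal: {sent:,} bytes, delivered as it was generated")

The piecemeal stream comes out a little larger, since each flush costs a few extra bytes; that overhead is the price of letting the user start reading early.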
Pure speculation: text is highly compressible, so let's say you have 10 screens of text (think W3C specifications). That will take a couple of seconds to download and render, and you may want the user to be able to start reading before it finishes. Given how compressible text is, though, if you use compression the user should get the whole thing at once in about the same amount of time.

Now let's assume that you absolutely have to put your 250-page manifesto on the web as a single page (think Project Gutenberg). No currently available compression algorithm is going to get that entire page to the user as quickly as sending her the first page and then filling in the rest without compression.

Why would anyone put 250 pages on a single page? I have absolutely no idea (even Project Gutenberg and the like usually offer the whole book as a zipped download, or break it into parts). That situation being so rare, I think people rarely bother to mention this potential downside of using compression.
Tom