
Creating and compressing files

         

ocon

9:08 pm on Jan 1, 2012 (gmt 0)

10+ Year Member Top Contributors Of The Month



I'm trying to improve my site's response time by creating a file cache.

Instead of rebuilding the .php page on every request, my server will first check whether a cached copy already exists and serve either the .html or the .html.gz file, depending on whether the user's browser accepts gzip. If the file does not exist, the .php page is loaded, which also creates the two static files.

At the end of my php file I have the following code:

file_put_contents("/cache/".$name.".html", $file);
exec("gzip -9f /absolute/path/to/cache/".$name.".html");
file_put_contents("/cache/".$name.".html", $file);


On line 1 I create the file.
On line 2 I gzip this file to create the .html.gz version
On line 3 I recreate the file because line 2 has destroyed it.

I'd like to rewrite my code so line 2 doesn't destroy the original file, just saves a compressed copy alongside it. That way I can eliminate step 3, which is pure waste.

Is this possible?

rainborick

4:54 pm on Jan 2, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If you're on an Apache server, you can always use a DEFLATE output filter to handle the gzip compression automatically. I used:

AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css application/x-javascript

It compresses only the selected file types, and only if the user's browser can handle it.

ocon

6:47 pm on Jan 2, 2012 (gmt 0)




Thanks for the reply. That would generate the php pages and gzip files on the fly for each page request by a supported browser, right?

Since my pages are built from a database, what I'm trying to do is create a regular static page and a compressed static page once, serve those files, and rebuild them only when the database changes. That should increase page-load speed and reduce both bandwidth and server load.
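The lookup described above could be sketched as a small helper that picks the right cache file from the Accept-Encoding header. This is only an illustration: the function name and the /cache layout are assumptions, not from the thread.

```php
<?php
// Choose which cached file to serve, based on what the client accepts.
// $name is the cache key; $acceptEncoding is the raw Accept-Encoding header.
function pickCacheFile(string $name, string $acceptEncoding, string $dir = '/cache'): string
{
    $plain   = $dir . '/' . $name . '.html';
    $gzipped = $plain . '.gz';
    // Serve the pre-compressed copy only when the browser advertises gzip
    // support; remember to send "Content-Encoding: gzip" alongside it.
    if (stripos($acceptEncoding, 'gzip') !== false) {
        return $gzipped;
    }
    return $plain;
}

echo pickCacheFile('home', 'gzip, deflate'), "\n";  // /cache/home.html.gz
echo pickCacheFile('home', 'identity'), "\n";       // /cache/home.html
```

If neither file exists yet, the request would fall through to the .php builder that writes both copies.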

rainborick

7:15 am on Jan 3, 2012 (gmt 0)




No. The .htaccess directive I posted will compress the outgoing data automatically. Whether the data comes straight from a plain HTML or text file or is the output of a PHP script, as long as it is one of the selected MIME types, it is compressed before being sent to the user. No pre-processing is required. But on reflection, you should probably ignore my advice.

By pre-compressing the data, you save a good deal of server load as the number of requests for these files grows; compressing on the fly like this on any kind of shared server could easily cost more efficiency than the compression saves. One correction, though: by default gzip does remove the original file once it has written the '.gz' archive, which is why your line 2 destroys the .html copy. Writing the compressed stream to stdout with 'gzip -c file.html > file.html.gz' leaves the original in place, and 'gunzip -c' will recover the data from the archive if you ever need it.
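A minimal PHP sketch of the keep-the-original variant, using gzip's '-c' flag to send the compressed stream to stdout. It assumes the gzip binary is on the server's PATH, and the directory, page name, and content below are placeholders, not the thread's real paths:

```php
<?php
// Write the page once, then compress it to a sibling .html.gz without
// touching the original: '-c' makes gzip write to stdout instead of
// replacing its input file.
$name = 'example';
$file = '<html><body>Hello</body></html>';
$dir  = sys_get_temp_dir();   // stand-in for the real cache directory

file_put_contents("$dir/$name.html", $file);
exec(sprintf('gzip -9cf %s > %s',
    escapeshellarg("$dir/$name.html"),
    escapeshellarg("$dir/$name.html.gz")));

// Both files now exist, so step 3 of the original code becomes unnecessary.
var_dump(file_exists("$dir/$name.html"));     // bool(true)
var_dump(file_exists("$dir/$name.html.gz"));  // bool(true)
```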

penders

11:20 am on Jan 3, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



ocon: That would generate the php pages and gzip files on the fly for each page request by a supported browser, right?


rainborick: No. The .htaccess directive I posted will compress the outgoing data automatically.


Shouldn't that be a 'yes'?

@ocon: Have you considered using PHP's own gzencode() [uk3.php.net], rather than running gzip from the shell?
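For reference, a minimal sketch of the cache-writing step with gzencode(): everything stays inside PHP, so nothing shells out and nothing gets deleted. The directory, page name, and content are placeholders.

```php
<?php
// Write both cache files in one pass: the plain page, and a gzip-format
// copy produced by gzencode() at maximum compression (level 9).
$name = 'example';
$file = '<html><body>Hello</body></html>';
$dir  = sys_get_temp_dir();   // stand-in for the real cache directory

file_put_contents("$dir/$name.html", $file);
file_put_contents("$dir/$name.html.gz", gzencode($file, 9));

// Round trip: decompressing the .gz yields the original markup.
var_dump(gzdecode(file_get_contents("$dir/$name.html.gz")) === $file); // bool(true)
```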

ocon

4:51 pm on Jan 3, 2012 (gmt 0)




Thank you penders, that was exactly what I was looking for.