Forum Moderators: coopster


Efficient method of combining and minifying files

javascripts, stylesheets etc

         

Mike521

1:47 pm on Aug 2, 2011 (gmt 0)

10+ Year Member



I'm working on some miscellaneous things to speed up my site, the next of which is combining my various JavaScripts into one file. I'm not sure whether the result could slow down the server, though, since it makes it work harder. Can someone give me some input?

Currently I have at least 3 separate JS files (living on my domain) that are included on every page, none of which are minified. There may also be one or two other JS files pulled in from my domain, depending on the page. FYI, I also load jQuery and jQuery UI from the Google domain.

So I have two goals: first, decrease the overall number of HTTP requests, and second, decrease the overall size of the downloaded files.

I found a handy PHP script someone had written that outputs the appropriate headers and uses readfile() to read in the various JavaScripts, which solves my first goal. For my second goal, I'm thinking about using an output buffer in the script, grabbing all the various JavaScripts, and then using PHP to do some minifying (removing tab characters and newlines at the very least).

Is that an efficient way of speeding up the site, or could the process of grabbing all the files and minifying them actually slow down the server?
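Roughly what I have in mind is a sketch like this. The file names are made up, and the whitespace-stripping is deliberately crude (it keeps newlines, since stripping those can break scripts with // comments); a real minifier would do better:

```php
<?php
// Sketch of the combine-then-minify idea. File names are placeholders,
// and this naive whitespace pass is no substitute for a real minifier.
function combine_and_minify(array $files)
{
    ob_start();
    foreach ($files as $file) {
        readfile($file);   // concatenate each script into the buffer
        echo "\n;\n";      // defensive separator between files
    }
    $js = ob_get_clean();

    $js = str_replace("\t", " ", $js);        // turn tabs into spaces
    return preg_replace('/ {2,}/', ' ', $js); // collapse runs of spaces
}

// Served as something like:
//   header('Content-Type: application/javascript');
//   echo combine_and_minify(array('one.js', 'two.js', 'three.js'));
```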

[edited by: Mike521 at 2:28 pm (utc) on Aug 2, 2011]

httpwebwitch

2:17 pm on Aug 2, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Good timing! I'm looking into doing this too. I've done manual minifying and awkward custom squishifying in the past, but now I'm interested in finding a good PHP-based solution to do the same thing.

I'm including mootools from the Google API, plus a pile of other scripts depending on what page you're on.

Have you looked at this?
[code.google.com...]

Mike521

2:28 pm on Aug 2, 2011 (gmt 0)

10+ Year Member



That's interesting, thanks! I hadn't seen that. The script I found was from a blog post; it's good, but maybe I'll use the Yahoo one instead.

I noticed that Yahoo says if you have a very high-traffic site it might actually be slower because of the PHP overhead. I don't think I'll have that problem (unfortunately!), but that's exactly what I was wondering about. Looks like it can technically be a problem, but only on a huge site.


Edit: my mistake; I thought this was a Yahoo script when skimming the page you linked.

penders

2:37 pm on Aug 2, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I believe you don't really need to minify it as long as you gzip it (and combine the files to minimize the number of http requests).

I found a handy PHP script someone had written, which outputs the appropriate headers and uses readfile() to read in the various javascripts. So this solves my first goal.


I would have thought this could solve both goals? By the sound of it, the script is already doing the hard part. It may not be much of a step for the script to also gzip the output... and caching it server-side would be a bonus?

gzip compression can, I believe, be handled entirely by the server via .htaccess (without having to use PHP). Although I can't imagine the compressed output is then cached? (And it won't combine files to minimize requests.) Maybe someone can provide the necessary info on this...?
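For what it's worth, on Apache with mod_deflate it's typically something like this in .htaccess (a sketch; check which modules your host actually enables):

```apache
# On-the-fly gzip for text assets (requires mod_deflate, Apache 2.x)
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript text/javascript
</IfModule>
```

As far as I know, mod_deflate compresses on every request rather than caching the compressed output, and it does nothing to combine files.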

Mike521

2:46 pm on Aug 2, 2011 (gmt 0)

10+ Year Member



You're right, and I don't know why I didn't realize it: the script is already gzipping, so I don't need to worry about part 2!

When you say cache it server-side, do you mean save the output as a temp file somewhere, then read and output that until it becomes outdated?

Mike521

2:49 pm on Aug 2, 2011 (gmt 0)

10+ Year Member



Actually, I found a thread on Stack Overflow about this. Long story short: you technically get a smaller file if you minify first and then gzip, but it doesn't seem to be the end of the world if you simply gzip.

penders

3:16 pm on Aug 2, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



...when you say cache it server side, do you mean save the output as a temp file somewhere, and then read/output that until it becomes outdated?


Yes. I assume this would be required on a busy server in order to minimise stress on the processor from repeatedly compressing files?

Mike521

3:38 pm on Aug 2, 2011 (gmt 0)

10+ Year Member



I agree penders. So for now I ended up with a script that does the following:

- create an array of files to include
- check each file's modified date against that of the cache file
- if the cache file is up to date, read and output it, then exit
- otherwise, grab the contents of each file
- minify
- write the contents to the cache file, output them, exit
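In sketch form it looks like this. The cache path and the minifier are placeholders for whatever you actually use:

```php
<?php
// Sketch of the cache flow above; $cacheFile and minify_js() are
// placeholders.
function serve_combined(array $files, string $cacheFile)
{
    // The cache is fresh only if it is newer than every source file.
    $fresh = file_exists($cacheFile);
    if ($fresh) {
        $cacheTime = filemtime($cacheFile);
        foreach ($files as $file) {
            if (filemtime($file) > $cacheTime) {
                $fresh = false;
                break;
            }
        }
    }

    if (!$fresh) {
        $js = '';
        foreach ($files as $file) {
            $js .= file_get_contents($file) . "\n;\n";
        }
        file_put_contents($cacheFile, minify_js($js));
    }

    header('Content-Type: application/javascript');
    readfile($cacheFile);
}

// Placeholder minifier: collapse tabs and runs of spaces. Newlines are
// kept on purpose, since stripping them can break // comments.
function minify_js(string $js)
{
    return preg_replace('/[ \t]+/', ' ', $js);
}
```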

httpwebwitch

5:53 pm on Aug 3, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I haven't experimented with that minify project yet. Any chance it might use memcache? I guess I could read the project specs but if you've already investigated, d'ya know?

Minified, gzipped, and combined JS files sitting in memcache and ETagged for browser caching is probably the fastest solution imaginable.
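Something like this is what I'm picturing. Everything here is hypothetical: the function and key are made up, and the cache object just needs get() and set() (the pecl Memcached class fits that shape in production):

```php
<?php
// Sketch: keep the gzipped bundle in a get/set cache (memcached in
// production) so it's compressed once and served many times.
function bundle_body(array $files, $cache, string $key)
{
    $gz = $cache->get($key);
    if ($gz === false) {
        $js = '';
        foreach ($files as $f) {
            $js .= file_get_contents($f) . "\n;\n";
        }
        $gz = gzencode($js, 9);        // compress once...
        $cache->set($key, $gz, 3600);  // ...cache for an hour
    }
    return $gz;
}

// In production (untested sketch):
//   $mc = new Memcached();
//   $mc->addServer('127.0.0.1', 11211);
//   $gz   = bundle_body($files, $mc, 'jsbundle:' . md5(implode(',', $files)));
//   $etag = '"' . md5($gz) . '"';
//   header('ETag: ' . $etag);
//   if (isset($_SERVER['HTTP_IF_NONE_MATCH'])
//           && $_SERVER['HTTP_IF_NONE_MATCH'] === $etag) {
//       header('HTTP/1.1 304 Not Modified');
//       exit;
//   }
//   header('Content-Type: application/javascript');
//   header('Content-Encoding: gzip');
//   echo $gz;
```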

Mike521

1:27 pm on Aug 4, 2011 (gmt 0)

10+ Year Member



I'm not sure, I actually didn't end up using that one. It seemed a little heavy for a task that I wanted to be simple, quick and efficient - I was worried it had a lot more bells and whistles than I needed. I ended up finding another small PHP class for JS minification.

[code.google.com...]

Off the top of my head I don't think either of them use memcache though

httpwebwitch

7:39 pm on Aug 4, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Wow, I just installed minify, and it took - no joke - less than 5 minutes from the time I downloaded the source, to having all my JS minified on my site.

Get the project. Inside it is a folder called "min". Put that in your public_html (a.k.a. your document root), so it exists at http://example.com/min/.

My site has a common header, which is included on every page.
In that header script, where I formerly had a pile of <script> tags pointing to a menagerie of files, I replaced them with this:

<?php
if (!isset($jsfiles) || !is_array($jsfiles)) {
    $jsfiles = array();
}
$jsfiles[] = "js/mootools-1.2-more.js";
$jsfiles[] = "js/modalbox.js";
$jsfiles[] = "js/searchbox.js";
$jsfiles[] = "js/footer.js";
$jshtml = '<script type="text/javascript" src="/min/f=' . implode(",", $jsfiles) . '"></script>';
echo $jshtml;
?>

Now, on pages that have additional scripts besides the standard set, I add them to the PHP array before the header, like this:

<?php
$jsfiles = array("js/custom.js","js/special.js"); // I added this line
include("header.php"); // this was already there
?>

and boom! it's done!

The compressed JS is cached on disk, in a private folder I define in the config.

And I can confirm that the entire pile of compressed scripts is being cached properly by the client: repeat requests return a "304 Not Modified" header. If I make a change to one of the component scripts (just adding a tab of whitespace), the next request gets the new content with "200 OK", and the following request is "304 Not Modified" again.

I am very impressed

httpwebwitch

7:44 pm on Aug 4, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm going to do my CSS next. Watch out, interwebs! This speed is gonna burn you up!

Mike521

8:13 pm on Aug 4, 2011 (gmt 0)

10+ Year Member



haha I know what you mean!

One thing to think about: I have the same situation as you, where there are some global scripts as well as some that only run on certain pages. If I read your code right, your users will download a couple of scripts such as:

1. /min/f=whatever.js (contains all the main scripts)
2. /min/f=whatever2.js (contains a few subscripts, PLUS the main scripts again)

So your users end up downloading all the main scripts a second time when they visit a subpage. That's what I had going on; I ended up splitting it into two script tags, one for all the main scripts and another for the subscripts.

I didn't like having to make them request two scripts but I can't think of any way around it. My subpages all have their own unique scripts so users will end up re-downloading the main scripts on almost every subpage unless I split the subscripts out.

httpwebwitch

8:26 pm on Aug 4, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've got it set up so all scripts on every page are all condensed into one.

the PHP array contains all the scripts needed for each page. On individual pages, I'll define it:

$jsfiles = array("one.js", "two.js");

then in the header, I'll add some more:
$jsfiles[] = "three.js";
$jsfiles[] = "four.js";

the result is an array that contains ["one.js","two.js","three.js","four.js"]

Implode that into a comma-separated list, and that's the parameter that gets passed to the "min" script, which minifies them all into one condensed and compressed lump.

So, I have my boilerplate scripts included on every page, and some custom ones that are only needed on specific pages, but on any given page there's only one <script> being requested.

It does mean that each page with custom additional ingredients produces a unique lump, and that lump needs to be downloaded. But since it's cached, the next person to hit that page gets the pre-squished lump with no extra processing needed.

You could keep things in two lumps, and there are advantages to that... browser caching being the most obvious. Maybe it's a good thing that your "main" ones are all lumped together, downloaded once on the home page, and cached in the browser forever after. Then only the custom scripts on individual pages are being downloaded.

try both, test them, see which one is faster!

httpwebwitch

8:55 pm on Aug 4, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



minifying my CSS was just as easy, using the same array building technique.


on my page:

<?php
$cssfiles = array("css/justthispage.css");
include("header.php");
?>


in my header.php file:

<?php
if (!isset($cssfiles) || !is_array($cssfiles)) {
    $cssfiles = array();
}
$cssfiles[] = "css/style.css";
$cssfiles[] = "css/header.css";
$cssfiles[] = "css/footer.css";
$csshtml = '<link type="text/css" rel="stylesheet" href="/min/f=' . implode(",", $cssfiles) . '" />';
echo $csshtml;
?>

Mike521

9:22 pm on Aug 4, 2011 (gmt 0)

10+ Year Member



Cool, I might do something similar with the CSS. Luckily I only have two stylesheets per page (served from my domain, at least; not counting jQuery UI etc.), but they can still benefit from being minified.

Yeah, I went with the two script tags so users can keep the main scripts cached for their entire visit, separate from the subscripts. I'll monitor and see how it goes.

I like to think that the site is noticeably faster but I'm sure that's just being optimistic : )

httpwebwitch

3:09 pm on Aug 5, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



My situation is identical. Now that I've thought about it for a bit... I might also go with 2 <script>s on each page. Not counting the ones that are hosted externally, e.g. Mootools hosted by the Google API.

I don't know if the site really is loading much faster. Hard to tell because I didn't perform any benchmarking before & after. But now the browser caching is being done correctly, and I know it wasn't before, so in theory it must be quicker.

Since minify is an open source project, I might get my hands in there and add a hook for storing the server-cached JS in memcache instead of on disk. A fun project for some day if I get bored.