Forum Moderators: coopster
I'm trying to build a PHP-based CMS and I've been thinking about images.
I was wondering if using functions like file_exists() and using GD to figure out the width and height of an image is particularly computationally expensive.
Basically I'm thinking of inserting all images in the page with a PHP function which will first check if the image exists (to avoid broken image links) and then creates the image tag automatically complete with width and height attributes (to avoid 'page popping' where the content keeps 'popping' down as the images are loaded in).
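For what it's worth, a minimal sketch of what I have in mind (the function name img_tag() and the paths are placeholders of my own, not anything from an existing library):

```php
<?php
// Hypothetical helper: emits an <img> tag with width/height attributes,
// or an empty string if the file is missing or unreadable.
function img_tag(string $path, string $alt = ''): string
{
    if (!file_exists($path)) {
        return ''; // skip the tag entirely to avoid a broken image link
    }
    $info = getimagesize($path); // false on failure
    if ($info === false) {
        return ''; // exists but isn't a readable image
    }
    return sprintf(
        '<img src="%s" alt="%s" %s>',
        htmlspecialchars($path, ENT_QUOTES),
        htmlspecialchars($alt, ENT_QUOTES),
        $info[3] // getimagesize() pre-builds the 'width="W" height="H"' string
    );
}
```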
Is that likely to put a lot of extra load on my server or noticeably increase page load times?
Thanks!
Since you'll be measuring a very small number, I suggest you set up a loop to run the operation several times (like 10 or 100) and then divide the result:
$s = microtime(true); // pass true to get a float; plain microtime() returns a string you can't subtract
for ($i = 0; $i < 100; $i++) {
// do stuff
} // EndFor a bunch of times
$e = microtime(true);
$r = ($e - $s) / 100;
echo "Operation took an average of $r seconds";
You may even want to hit refresh 4 or 5 times - you'll likely never see the same answer twice, since the server is also running off to do other things. For this particular operation I would also set up an array of about 10 different pictures so you know nothing's being optimized away and throwing off the test (for example, the server might see 'oh, I have this image cached, so I don't have to pull it from the drive again'). Keep in mind that a "little bit" of the reported processing time is overhead from your loop and from the cache busting itself.
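Putting the loop and the image array together, the benchmark might look like this (the directory path is a placeholder for wherever your own test images live):

```php
<?php
// Sketch of the benchmark described above: time file_exists() + getimagesize()
// across an array of different images so the filesystem cache doesn't skew things.
$images = glob('/path/to/test/images/*.jpg'); // ~10 different files
$runs   = 100;

$s = microtime(true); // float seconds
for ($i = 0; $i < $runs; $i++) {
    foreach ($images as $img) {
        if (file_exists($img)) {
            getimagesize($img);
        }
    }
}
$e = microtime(true);

// max(..., 1) guards against division by zero if the glob finds nothing
$avg = ($e - $s) / ($runs * max(count($images), 1));
printf("Average per image: %.6f seconds\n", $avg);
```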
I expect you'll see a number that's less than a millisecond (.001 second). Compared to the time it takes to transmit the page, in my opinion just about anything we're likely to do in script is negligible - it's not until you start doing things like pulling several thousand records from a database table to twiddle around that you have to wonder if there's a better way to do what you want to do (imHo ;) ).