|HTML parsing PHP server load|
Will this affect the server load by a lot?
I'm trying to redo a site to use PHP includes for navigation and the like. The main reason being that I don't want to rename all the pages to shtml or php, to avoid broken links/awkward re-directs and general confusion.
The network admin is concerned that parsing all of the html files as php is going to create a significant increase in the server load. I was asked to look into it. Does anyone have any idea how great an impact that will have? We get around 35,000 requests per month.
This is my first post here, so I'm not sure if I'm asking in the right place. Please feel free to re-direct me (HA!) if needed.
35K requests per day is barely a server load these days unless you have ancient hardware.
35K per month is an idle server.
Most sites, really busy sites, run PHP all day long for forums, blogs, etc. for every single page.
Your network admin is far too cautious.
So, and forgive my ignorance, does forcing the server to parse html files as php produce more load than simply using php files to begin with? That is, if I were to re-make every html file as a php file, would the load be the same?
I appreciate your input.
The load should be the same whether you configure PHP to pre-process all .html files (I do this on all my servers) or re-make them as .php files. I don't really see a difference, because the amount of execution should be identical.
In either case PHP has to process your HTML content looking for the beginning and ending PHP tags that mark where the code is. You gain nothing by telling it in advance that it's a .php file, except that Apache immediately knows what to do with it instead of processing yet another directive. Even knowing it's a .php file, PHP still has to scan all the HTML to find the PHP tags; same difference either way.
How you run PHP on your server is more likely to impact performance than PHP itself. Some run PHP as a CGI or FastCGI instead of installing it as an Apache module. If your server has it installed as a CGI then it's going to suck wind.
An easy way to figure out exactly how much time you're adding per page is to get real numbers from page-load performance testing tools: try your site before you add PHP and after.
Google Page Speed for Chrome would be a good start.
However, unless you're running your site on a Commodore-64, or the chip powering your server is a Dorito, you shouldn't notice much of a delay, typically milliseconds, not seconds, unless they're stupidly running PHP as a CGI.
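For server-side numbers, a command-line benchmark can compare the same page before and after the handler change. This is a sketch assuming ApacheBench (`ab`, which ships with Apache) and a placeholder URL; run it against your own site:

```shell
# -n: total requests, -c: concurrent requests (URL is a placeholder).
# Run once before enabling the PHP handler and once after, then compare
# the "Time per request" lines from the two runs; the difference is the
# PHP parsing overhead per page.
ab -n 500 -c 10 http://www.example.com/index.html
```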
If someone else knows better they can chime in here.
It should be fine, and no, it shouldn't be "significant." But what your admin is suggesting is that it's more load than necessary, which is absolutely true. Think about this:
|I don't want to rename all the pages to shtml or php, to avoid broken links/awkward re-directs and general confusion. |
I don't know what you mean by "awkward redirects," as you shouldn't need any, and in **either** case you shouldn't have to rename anything (which would be bad, because then you'd have to 301 all URLs for search engines). But what's relevant here is that what you'll be doing for PHP is the exact same thing you'd do for SSIs.
- Add a handler to your .htaccess or server config to parse .htm/.html as PHP OR SSI (depending on which you choose; you can't do both, obviously)
- Modify the relevant areas of all documents for the inclusion
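For the first step, the handler addition might look like this in .htaccess. This is a sketch for Apache; the exact PHP handler value depends on how PHP is installed on your server, so treat it as illustrative:

```apache
# Option A: hand .htm/.html to PHP (mod_php-style handler; pick A or B,
# not both)
AddType application/x-httpd-php .htm .html

# Option B: hand .htm/.html to the SSI parser (requires mod_include)
Options +Includes
AddOutputFilter INCLUDES .htm .html
```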
In terms of the tasks you'll perform, it's the same thing. Instead of calling the rich server-side scripting language PHP with this,
<?php include($_SERVER['DOCUMENT_ROOT'] . '/includes/header-nav.html'); ?>
You'd call the server-side includes parser, which is really "the right tool for the job" if that's all you're doing:
<!--#include virtual="/includes/header-nav.html" -->
Will it be less load on the server? Likely, but I don't know. My point is that there's no reason to call in a rich server-side programming language if all you're doing is including a static file. You can even use SSIs to set "current page" values if you want, or even execute scripts:
<!--#include virtual="/includes/some-script.php" -->
<!--#exec cgi="/cgi-bin/perl-script.pl" -->
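As an example of the "current page" trick, mod_include's set/if directives can do it. This is a sketch using the legacy Apache 2.2 expression syntax; the variable name and markup are made up for illustration:

```html
<!-- In each page, before pulling in the shared nav: -->
<!--#set var="current" value="about" -->
<!--#include virtual="/includes/header-nav.html" -->

<!-- Inside /includes/header-nav.html: -->
<!--#if expr="$current = 'about'" -->
  <span class="active">About</span>
<!--#else -->
  <a href="/about.html">About</a>
<!--#endif -->
```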
I'm posting to add my agreement to what incrediBILL said, in case your admin remains skeptical until there are multiple replies.
|The main reason being that I don't want to rename all the pages to shtml or php, to avoid broken links/awkward re-directs and general confusion. |
Yes, good strategy.
rocknbil, I believe what the OP meant was awkward redirects if they renamed .htm pages to .php. The renaming is the thing to avoid.
I'd still go with PHP includes over SSI, because it makes available the other features of PHP, which many people eventually decide they want, anyway. If you've converted to SSI, then you'd need a 2nd conversion, from SSI to PHP.
PHP as CGI is said to be slower, and theoretically should be, but on a shared server it does allow you to be much better walled off and protected from your neighboring sites on the server. Folders and files that need world-writable permissions (777/666) under Apache module can be made writable only by the owner (755/644) with CGI. In a shared environment, that is a worthwhile tradeoff.
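The permissions tradeoff described above can be sketched at the shell; the directory name is just an example:

```shell
# Under mod_php, PHP runs as the web server's user, so a directory that
# PHP must write to often has to be world-writable:
mkdir -p uploads
chmod 777 uploads          # writable by every account on a shared box

# Under PHP as CGI (with suEXEC/suPHP), PHP runs as YOUR user, so
# owner-only write permission is enough:
chmod 755 uploads
stat -c '%a' uploads       # prints the final octal mode: 755
```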
I've used both module and CGI and although CGI probably is slower, it was nothing that I could notice as a user.
CGI is slower because the PHP interpreter, instead of being continuously loaded in memory and ready for use as a module, must be invoked from disk each time a .php page is to be processed.
However, the difference could be completely negligible if the server caches its most recently used disk files to memory for faster re-access. Windows does that as a matter of course, and I suspect Linux does it, too. In that case, the PHP interpreter would just be getting loaded from memory (at least most of the time), whether PHP is a module or a CGI.
|then you'd need a 2nd conversion, from SSI to PHP |
I've currently got a site doing both:
|AddType application/x-httpd-php .htm .html .shtml |
and it gets worse:
It's still operating at decent speeds.
However, re: CGI, caching may speed it up somewhat, but it's not a persistent binary instance loaded in the machine. There are startup procedures (loading php.ini, etc.) that happen each time a CGI is invoked but are already done in advance with the Apache module. For instance, loading browscap.ini each time you start PHP would be a killer.
True, and while you might not notice speed issues on a machine with low to moderate load, I certainly wouldn't want to use a CGI-loaded version for a high-performance site.
Also, loading PHP as CGI has other issues, like the fact that setting certain PHP INI values in the .htaccess file doesn't work unless you load yet MORE software on the server, etc.
Not that I wouldn't do it if I had no choice, just don't recommend it.
Well, the network admin mentioned using the XBitHack directive to accomplish what I want without parsing the html pages as php.
The problem with SSI is that it normally requires the shtml extension, which would mean new file names (leading to awkward redirects). I guess this hack allows SSI on html pages. It requires changing the permissions on each file to 744. While that's not a big deal, as it stands the web server doesn't accept my chmod changes and reverts them when the files get copied over. So once that's worked out, I'll see if this works.
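For reference, the XBitHack approach boils down to one directive plus the execute bit. `XBitHack` is a real mod_include directive; the filename below is just an example:

```shell
# In httpd.conf or .htaccess, with mod_include enabled:
#   XBitHack on
# Then, instead of renaming a page to .shtml, flag it for SSI parsing
# by setting the owner-execute bit:
touch page.html
chmod u+x page.html        # 644 becomes 744: the "x" is Apache's cue
test -x page.html && echo "will be parsed for SSI"
```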
Now what happens when I want to use PHP on pages? I guess just make them .php pages.
Well, thanks for all of your input. I've learned a lot at the very least.