
Do big PHP scripts demand system resources?

Jeremy_H

5:49 am on Apr 24, 2006 (gmt 0)

10+ Year Member



I'm looking into using a specific script for on-the-fly pdf generation.

The one I'm looking at is based around PHP, which was something I sought out.

I'm concerned because at its heart is a 40 KB PHP file full of commands and variable declarations.

I strive to keep my site as optimized as possible, and I know that the PHP file is run on the server and is not transferred to the user. However, I'm wondering whether running this 40 KB file every time a PDF is requested will turn into a server processing hog.

Any advice?

freeflight2

6:57 am on Apr 24, 2006 (gmt 0)

10+ Year Member



Don't worry until your site generates a couple of million page views per month. At that point eAccelerator can help a lot.

hakre

11:41 am on Apr 24, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Dear Jeremy_H,

There are a lot of classes out there with even bigger file sizes (well over 40K), and they do not kill your server. If you need to analyze how many resources PHP is using, you can do so with its information functions. In your case an important one is memory_get_usage [php.net], which returns the amount of memory, in bytes, currently allocated to your PHP script (including the 40K include).

You should then compare that value with your server's memory and the number of users. To get an idea of the execution time the script needs, check out microtime() [php.net]. With the provided examples you can measure how long your script takes to execute.
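A very rough sketch of such a measurement could look like this (the include name is only a placeholder for the 40K library, and microtime(true) needs PHP 5):

<?php
// measure memory and time around the PDF generation
$startTime   = microtime(true);   // on PHP 4, parse the "msec sec" string instead
$startMemory = memory_get_usage();

require 'pdf_library.php';        // placeholder for the big include
// ... generate the PDF here ...

$elapsedSeconds = microtime(true) - $startTime;
$bytesUsed      = memory_get_usage() - $startMemory;

error_log(sprintf('PDF generation: %.3f s, %d bytes', $elapsedSeconds, $bytesUsed));
?>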

The rest is just calculation. For example: if one user causes 10 MB of usage per request for one minute and your server has 500 MB of RAM available for PHP, you can handle 50 users per minute before hitting your maximum memory usage (in theory). Keep in mind that the web server itself will consume memory too, and that there are limits on how many processes can be forked and so on, so this really is only "in theory".

With that done you should get a picture of what is happening and how heavy the impact of the library you're using really is.

Often you can create an "accelerator" yourself, for example by caching generated PDF documents or by replacing content in an already generated PDF. So even if that script is a processing hog, there may be more than one way to a solution.
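A minimal caching sketch (the paths, the id parameter and the generate_pdf() function are only placeholders for whatever your library provides):

<?php
// serve a cached copy if one exists, otherwise generate and store it
$documentId = basename($_GET['id']);                        // placeholder request parameter
$cacheFile  = '/path/to/cache/' . md5($documentId) . '.pdf';

if (!file_exists($cacheFile)) {
    $pdfData = generate_pdf($documentId);                   // placeholder for the heavy library call
    file_put_contents($cacheFile, $pdfData);                // file_put_contents needs PHP 5
}

header('Content-Type: application/pdf');
header('Content-Length: ' . filesize($cacheFile));
readfile($cacheFile);
?>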

whoisgregg

1:59 pm on Apr 24, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I am also about to implement a big PDF generation project on one of my sites.

I plan to circumvent the issue of CPU usage by caching as much of the PDF generation as possible. Even if you are personalizing each PDF, you may find some base elements can be generated and cached in advance.

Example: If you are making widget data sheets with the customer's price quote on them, you could build and cache the PDF with the product picture, description, etc., then add the price quote at the point of request.
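One way that could look in practice (FPDI isn't something anyone here has committed to, just one add-on that can import an existing PDF page; the file names, coordinates, and price variable are placeholders):

<?php
require 'fpdf.php';
require 'fpdi.php';   // FPDI: an FPDF add-on for importing pages from existing PDFs

$quotedPrice = 123.45;                                 // placeholder: the per-customer value

$pdf = new FPDI();
$pdf->setSourceFile('cache/widget-datasheet.pdf');     // the pre-built, cached base PDF
$template = $pdf->importPage(1);

$pdf->AddPage();
$pdf->useTemplate($template);

// add only the per-request part: the customer's price quote
$pdf->SetFont('Arial', 'B', 12);
$pdf->SetXY(25, 250);
$pdf->Cell(0, 10, 'Your quoted price: $' . number_format($quotedPrice, 2));

$pdf->Output('quote.pdf', 'D');                        // send to the browser as a download
?>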

hakre

3:29 pm on Apr 24, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



whoisgregg, which PDF/PHP library are you using for that type of PDF generation? Have you experimented with other formats like TIFF or similar?

whoisgregg

5:42 pm on Apr 24, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



For this next project, PDF was chosen to provide documents in a file format that is widely understood, widely accepted, and as small as possible for printing by the visitor. The documents contain graphics but are mostly text. TIFF would not meet those goals as well as PDF does.

I *was* going to use PDFlib, but I've recently decided to give FPDF a shot. I'd bet that the OP is looking at the same two packages. ;)
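For reference, the basic FPDF pattern is tiny (this is just the standard minimal usage, not my project code):

<?php
require 'fpdf.php';

$pdf = new FPDF();
$pdf->AddPage();
$pdf->SetFont('Arial', 'B', 16);
$pdf->Cell(40, 10, 'Hello World!');   // a single cell of text
$pdf->Output();                       // send the generated PDF to the browser
?>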

hakre

7:06 am on Apr 25, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I stumbled over FPDF some time ago and I can only say the best about it. It's just cool for creating PDFs. ;)