
PHP Server Side Scripting Forum

    
Avoid php "hanging" page
ktsirig
msg:4024320 · 11:31 am on Nov 13, 2009 (gmt 0)

Hello,
I have a PHP page where the user enters some data into a textarea, and this data is then fed to a command-line program. When the command-line program finishes, its results are stored in an output file that it creates. The "results" PHP page then reads this file and shows the results to the user. My problem is that sometimes (depending on the amount of data entered and, consequently, the size of the output file the command-line program creates), the PHP results page crashes with an error like:


Fatal error: Allowed memory size of 16777216 bytes exhausted (tried to allocate 846269 bytes) in /results.php on line 125

On that line of the script, I read the whole file into a string and then split it into result chunks on a separator (//) that the command-line program writes:


// Slurp the entire output file into one string, then split it into
// records on the "//" separator the command-line program writes.
$res_handle = fopen($output_file, "r");
$results = fread($res_handle, filesize($output_file));
fclose($res_handle);

$separated_entries = explode("//", $results);

Then, for each chunk, I output the results to the user. Is there something I must configure in PHP so that it won't run out of memory? Or is there another way to read the whole output file without exhausting memory?

thank you!

 

londrum
msg:4024321 · 11:39 am on Nov 13, 2009 (gmt 0)

You could try putting this at the top of the PHP script. It should work as a quick fix, but you'd probably be better off looking for ways to cut the script's memory use.

ini_set("memory_limit","1000M");
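If you go that route, it's worth measuring how much memory the page actually peaks at, so you can set a sensible limit rather than an enormous one. A minimal sketch (the 256M figure is just an example value, not a recommendation):

ini_set("memory_limit", "256M"); // per-request override; php.ini is unchanged

// ... rest of the page runs as normal ...

// Log how close the request actually came to the cap.
error_log("peak memory: " . memory_get_peak_usage(true) . " bytes");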

timster
msg:4024396 · 2:42 pm on Nov 13, 2009 (gmt 0)

I read the whole file into a string

I'd suggest splitting the input up and working with it in chunks, unless you really need to load it all into memory at once. It looks like you're using more than 16 MB of memory for a single request.
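A minimal sketch of that approach: stream the file with fgets() and emit each record as soon as it is complete, so only one record is ever held in memory. This assumes the program writes each "//" separator on its own line; print_entry() is a stand-in for however you render one record:

$res_handle = fopen($output_file, "r");
if ($res_handle === false) {
    die("could not open results file");
}

$chunk = "";
while (($line = fgets($res_handle)) !== false) {
    if (trim($line) == "//") {
        print_entry($chunk); // render one record, then drop it from memory
        $chunk = "";
    } else {
        $chunk .= $line;
    }
}
if ($chunk != "") {
    print_entry($chunk); // last record may lack a trailing separator
}
fclose($res_handle);

This way peak memory is bounded by the size of the largest single record, not the whole file.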

rocknbil
msg:4024582 · 7:45 pm on Nov 13, 2009 (gmt 0)

How large are these files? If the file is under 2000 lines, you shouldn't be crashing; there's probably an issue in how you are processing it. But a periodic dump-to-screen does free up memory as you go.

If they are larger files, consider forking the time-intensive process with pcntl_fork(), PHP's implementation of fork(). Basically, you hand the time-intensive work off to the spawned child and return a response to the user immediately in the parent.
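A minimal sketch of that pattern, with hypothetical command and file names. Note that pcntl_fork() requires the pcntl extension and works in the CLI/CGI SAPIs, not under Apache's mod_php:

$pid = pcntl_fork();

if ($pid == -1) {
    die("fork failed");
} elseif ($pid == 0) {
    // Child: run the slow command-line job, write the output file, exit.
    exec("/usr/local/bin/analyze input.txt > output.txt");
    exit(0);
} else {
    // Parent: respond to the user right away; the results page can poll
    // until output.txt exists.
    echo "Job started. Refresh the results page shortly.";
}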
