


Avoid php "hanging" page

11:31 am on Nov 13, 2009 (gmt 0)

Junior Member

10+ Year Member

joined:Sept 26, 2005
votes: 0

I have a php page where the user enters some data in a textarea, and this data is then fed into a command line program. When the command line program finishes executing, the results are stored in an output file that it creates. The "results" php page then reads this file and the user views the results. My problem is that sometimes (depending on the amount of data entered and, subsequently, the size of the output file created by the command line program), the php results page crashes with an error like:

Fatal error: Allowed memory size of 16777216 bytes exhausted (tried to allocate 846269 bytes) in /results.php on line 125

On this line of the script, I read the whole file into a string and then separate each result chunk by a separator (//) that the command line program writes:

$res_handle = fopen($output_file, "r");
$results = fread($res_handle, filesize($output_file));

$separated_entries = explode("//", $results);

Then, for each chunk, I output the results to the user. Is there something I must configure in php so that it won't run into memory problems? Or another way to read the whole output file without exhausting the memory?

thank you!

11:39 am on Nov 13, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Feb 12, 2006
votes: 22

you could try raising the memory limit at the top of the php script, which should work as a quick fix, but you'd probably be better off looking for ways to reduce the script's memory use.
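The snippet this post pointed at didn't survive the archive; given the "Allowed memory size exhausted" error above, it was most likely a `memory_limit` override along these lines (the `64M` value here is only an example, not a recommendation from the original post):

```php
<?php
// Quick fix only: raise this script's memory limit. The 64M value
// is an example -- pick a value the job actually needs. The real
// cure is processing the file in smaller pieces.
ini_set('memory_limit', '64M');
```

`ini_set()` affects only the current request, so the rest of the server keeps its configured limit.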

2:42 pm on Nov 13, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Aug 1, 2003
votes: 0

I read the whole file into a string

I'd suggest splitting the input up and working with it in chunks, unless you really need to load these things into memory all at once. Looks like you're using >16 MB of memory to run a single process.
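A chunked approach could look something like this sketch. `process_chunks()` and its callback are hypothetical names; `$output_file` and the `//` separator come from the original script. The point is that only one chunk plus a small buffer is ever in memory, instead of the whole file:

```php
<?php
// Stream the output file line by line instead of fread()ing it all
// at once. $callback receives each "//"-delimited chunk as soon as
// it is complete, so memory use stays roughly one-chunk-sized.
function process_chunks($output_file, $callback) {
    $handle = fopen($output_file, 'r');
    if ($handle === false) {
        return;
    }
    $buffer = '';
    while (($line = fgets($handle)) !== false) {
        $buffer .= $line;
        // Emit complete chunks as soon as a separator appears.
        while (($pos = strpos($buffer, '//')) !== false) {
            $callback(substr($buffer, 0, $pos));
            $buffer = substr($buffer, $pos + 2);
        }
    }
    fclose($handle);
    if ($buffer !== '') {
        $callback($buffer); // trailing chunk after the last separator
    }
}
```

Usage would then be something like `process_chunks($output_file, function ($chunk) { echo format($chunk); });` where `format()` stands in for whatever per-chunk output the results page already does.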

7:45 pm on Nov 13, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member rocknbil is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Nov 28, 2004
votes: 0

How large are these files? If the file is under 2000 lines, you shouldn't be crashing; there's an issue with "how you are doing it." But a periodic dump-to-screen does free up memory as you go.
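The "periodic dump-to-screen" idea might look like this sketch (`dump_entries()` is a hypothetical name; the entries array stands in for the `$separated_entries` from the first post): echo each chunk as soon as it is ready and drop the reference, rather than accumulating one huge output string.

```php
<?php
// Echo each entry immediately and release it, instead of building
// the entire response in memory first.
function dump_entries(array $entries) {
    foreach ($entries as $i => $entry) {
        echo $entry, "\n";
        unset($entries[$i]); // let PHP reclaim this chunk's memory
        flush();             // push the output toward the browser now
    }
}
```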

If they are larger files, consider forking the time-intensive process using pcntl_fork(), PHP's implementation of fork(). Basically, you pass the time-intensive work off to the spawned child and return a response to the user immediately in the parent.
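A minimal sketch of that parent/child split, assuming the pcntl extension is available (it works under CLI/CGI, not under mod_php). `start_in_background()` and the `$job` callable are hypothetical names standing in for the command-line step from the original post:

```php
<?php
// Fork a child to do the slow work; the parent returns immediately
// so the page can respond to the user without waiting.
function start_in_background(callable $job) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        return false;   // fork failed; caller can fall back to running inline
    }
    if ($pid === 0) {
        $job();         // child: do the time-intensive work...
        exit(0);        // ...then exit without returning to the page
    }
    return $pid;        // parent: child's pid; respond to the user now
}
```

The parent can then tell the user the job has started, and a later "results" page reads the output file once the child has written it.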