Forum Moderators: coopster


Question on files?

         

ktsirig

11:02 pm on May 1, 2006 (gmt 0)

10+ Year Member



Hi,
I am using PHP to make a website related to biology. In this site I use PHP code to communicate with external programms (not in PHP, but some biological packages). These programms write their output in temporary files, which I then read using fread() or file_get_contents() functions.
My problem is that these functions seem to have a restriction as to the size of the file that they can read. Is there any other way I read big files (for files close to 20-30MBs for example)?
Please note that there is no other way for me to do this because these programs, by default, write their output to a text file, so I can't use PHP's system commands and read the output on the fly. The output file is first created and then I read it into a string and parse it according to what I need.

grandpa

6:38 am on May 2, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Check the user notes on the fread [us2.php.net] function and see if they don't help. One of them says, for example:
For downloading big files (more than 8 MB), you must use ob_flush()

I might consider a while loop to process a large file in chunks, but I have no practical experience to offer here. Sorry.
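A chunked while loop along those lines might look like this (a sketch only; the filename and the 8 KB chunk size are placeholders you would replace with your own):

```php
<?php
// Read a large file in fixed-size chunks instead of loading it all at once.
// 'output.txt' and the 8192-byte chunk size are example values.
$handle = fopen('output.txt', 'rb');
if ($handle === false) {
    die('Could not open file');
}
while (!feof($handle)) {
    $chunk = fread($handle, 8192);   // read up to 8 KB at a time
    if ($chunk === false) {
        break;
    }
    // ... parse $chunk here; only 8 KB is held in memory at any moment ...
}
fclose($handle);
?>
```

The parsing logic has to cope with records that straddle a chunk boundary, e.g. by carrying the unparsed tail of one chunk over into the next.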

hakre

10:29 am on May 2, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



My problem is that these functions seem to have a restriction as to the size of the file that they can read. Is there any other way I read big files (for files close to 20-30MBs for example)?

in general there might be a restriction on the file size itself, but that is not the point here. since you read the whole file into memory, you need to allow php to consume more memory. check the memory_limit setting on your server and raise it, and you will have no problems in the future.
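If the host allows it, the limit can also be raised per-script with ini_set() (the '64M' value below is an arbitrary example; pick something comfortably above your largest output file):

```php
<?php
// Raise PHP's memory ceiling for this script only.
// '64M' is an example value, not a recommendation.
ini_set('memory_limit', '64M');
echo ini_get('memory_limit');   // confirm the new limit took effect
?>
```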

the other solution is not to hold the whole file in memory at once (file_get_contents does this), but - as already suggested - to read it through a buffer of some bytes only, just enough to parse the file.
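If the program's output happens to be line-oriented, fgets() is the simplest form of that buffered approach, since it keeps only one line in memory at a time (a sketch; 'result.txt' is a placeholder filename):

```php
<?php
// Parse a large line-oriented file one line at a time.
// Memory use stays roughly constant regardless of file size.
$handle = fopen('result.txt', 'rb');
if ($handle === false) {
    die('Could not open file');
}
while (($line = fgets($handle)) !== false) {
    // ... parse $line here ...
}
fclose($handle);
?>
```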

--hakre