Forum Moderators: coopster
To download big files (more than 8 MB), you must use ob_flush().
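A minimal sketch of what that looks like: stream the file to the client in small chunks, calling ob_flush() and flush() after each chunk so PHP never holds the whole file in memory. The function name, file name, and chunk size here are made up for illustration.

```php
<?php
// Hypothetical sketch: send a file to the client chunk by chunk instead of
// loading it all at once. stream_file() and the 8 KB chunk size are example
// choices, not a fixed API.
function stream_file($path, $chunkSize = 8192)
{
    $fp = fopen($path, 'rb');
    while (!feof($fp)) {
        echo fread($fp, $chunkSize); // send one chunk
        if (ob_get_level() > 0) {
            ob_flush();              // flush PHP's output buffer...
        }
        flush();                     // ...and push it out to the web server
    }
    fclose($fp);
}

// In a real download script you would send headers first, e.g.:
// header('Content-Type: application/octet-stream');
// header('Content-Length: ' . filesize('bigfile.zip'));
// stream_file('bigfile.zip');
```

Peak memory stays around one chunk regardless of how large the file is.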
I might consider a while loop to process a large file in chunks, but I have no practical experience to offer. Sorry.
My problem is that these functions seem to have a restriction on the size of the file they can read. Is there any other way to read big files (files close to 20-30 MB, for example)?
In general there might be a restriction on the file size itself, but that is not the point here. Since you read the whole file into memory, you need to allow PHP to consume more memory. Check the memory limit on your server and raise it, and you will have no problems in the future.
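That limit is the memory_limit directive in php.ini. As a quick sketch, you can inspect it and (where the host allows it) raise it at runtime with ini_set(); the 128M value below is just an example figure:

```php
<?php
// Sketch: check PHP's current memory limit and try to raise it at runtime.
// '128M' is an arbitrary example value; shared hosts may disable ini_set().
echo 'Current limit: ' . ini_get('memory_limit') . "\n";
ini_set('memory_limit', '128M');
echo 'New limit: ' . ini_get('memory_limit') . "\n";
```

If ini_set() is disabled, the same change goes in php.ini (`memory_limit = 128M`) or a .htaccess `php_value` line, depending on the server setup.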
The other solution is not to process the whole file in memory at once (which is what file_get_contents() does), but - as already suggested - to use a buffer of some bytes only, just enough bytes to parse the file.
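The buffered approach can be sketched like this: open the file and read it piece by piece, processing each piece before reading the next. Counting lines stands in here for whatever parsing the real script does; the function name is made up.

```php
<?php
// Sketch of buffered reading: fgets() pulls one line at a time, so memory
// use stays small no matter how big the file is. count_lines() is just an
// example stand-in for real parsing work.
function count_lines($path)
{
    $count = 0;
    $fp = fopen($path, 'rb');
    while (($line = fgets($fp)) !== false) {
        $count++; // process this line, then let it be garbage-collected
    }
    fclose($fp);
    return $count;
}
```

For binary formats, fread() with a fixed chunk size plays the same role as fgets() does for line-oriented text.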
--hakre