Text file reading



babloo

2:10 pm on Oct 11, 2004 (gmt 0)

10+ Year Member



Hello,

Can anyone please help me with this issue? I have a text file with more than 10000 records in it. My problem is that when I open and read the text file through PHP, it takes a lot of time and my system hangs. Sometimes I also get a max execution time error. Is there any way to get better performance for this? I have used file("filename") to read the records and store them in an array for calculations like the number of counts etc.

Any help is really appreciated.

Thanks
Babloo

mincklerstraat

4:43 pm on Oct 11, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You're probably not going to like this answer, but: it would probably help to put these records in a database instead. That way, you can simply select the records you need, or count them, without having to read in the entire contents of your file. Don't feel silly about having gotten started on the wrong foot this way - I've also gone down that road and gotten impossible execution times for relatively simple procedures - the only problem was way, way too many records.
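As a rough sketch of what that might look like (the table name "records", the column "value", and the use of SQLite via PDO are all just illustrative assumptions - use whatever database your host provides):

```php
<?php
// Hypothetical sketch: SQLite (via PDO) stands in for whatever
// database you end up using; "records" and "value" are placeholder names.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$db->exec('CREATE TABLE records (id INTEGER PRIMARY KEY, value TEXT)');

// One-time import: insert each line of the old text file as a row.
$stmt = $db->prepare('INSERT INTO records (value) VALUES (?)');
foreach (array('alpha', 'beta', 'gamma') as $line) {
    $stmt->execute(array($line));
}

// Counting no longer requires reading a 10000-line file into memory:
$count = $db->query('SELECT COUNT(*) FROM records')->fetchColumn();
echo $count; // 3
?>
```

The point is that the count happens inside the database, so PHP never touches more than one query result.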

As long as you stick with the current text-file approach, you can try
ini_set('max_execution_time', 90); at the beginning of your script (it won't work if your host runs in safe mode). This increases the max execution time to 90 seconds - and if that isn't enough, you can raise the value to whatever is needed (and not likely to get you or your server in trouble).

It's also possible with fread to read your file only up to a given length, but this is likely to complicate things, and if you have to process the whole file anyway, it might not help.
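Another option in the same spirit, if you do stay with the text file, is to read it one line at a time with fgets() instead of slurping everything into an array with file() - memory use stays flat no matter how many records there are. A minimal sketch (the sample data below just stands in for your real 10000-record file):

```php
<?php
// Minimal sketch: count records by reading one line at a time with
// fgets() instead of loading the whole file with file().
// A small temp file stands in for the real data here.
$file = tempnam(sys_get_temp_dir(), 'rec');
file_put_contents($file, "record1\nrecord2\nrecord3\n");

$count = 0;
$fh = fopen($file, 'r');
while (($line = fgets($fh)) !== false) {
    $count++;  // per-record calculations would go here
}
fclose($fh);
unlink($file);

echo $count; // 3
?>
```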

babloo

4:32 am on Oct 12, 2004 (gmt 0)

10+ Year Member



Yes, I got your point. Thanks for your valuable comments. I will try importing the records into a database and doing the calculations there.

Thanks