Forum Moderators: coopster
I am writing a script that needs to read in a rather large flat-file database (.txt). It is about 300 KB and contains approximately 15,000 records (i.e. \n-terminated lines).
I was going to simply use include()
$db = include("flat-file.txt");
And then work with $db.
However, is there a better way to do this [faster, more scalable], such as fopen(), etc.?
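For comparison, here is a rough sketch of the two usual pure-PHP approaches (the file name is taken from the post; the function names are just for illustration):

```php
<?php
// Sketch: read a flat-file DB either all at once or line by line.
// 'flat-file.txt' is the name from the post; any path works.

// Read the whole file into an array of lines (fine for a ~300 KB file):
function loadAll(string $path): array
{
    return file($path, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
}

// Stream the file, applying $callback to one record at a time
// (this scales to files far larger than available memory):
function streamLines(string $path, callable $callback): void
{
    $handle = fopen($path, 'r');
    while (($line = fgets($handle)) !== false) {
        $callback(rtrim($line, "\n"));
    }
    fclose($handle);
}
```

For a 300 KB file, loadAll() is simplest; streamLines() only starts to matter when the file outgrows memory.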
Thanks!
Look at the "load data infile" command
[dev.mysql.com...]
Here is an example:
load data infile 'flat-file.txt' into table your_table fields terminated by ',' enclosed by '"' lines terminated by '\n';
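If you want to issue that statement from PHP, a minimal sketch might look like this (the helper name, database credentials, and table name are placeholders, not from the thread):

```php
<?php
// Hypothetical helper: builds the LOAD DATA INFILE statement for a given
// file and table, matching the comma-separated, quote-enclosed format above.
function buildLoadDataSql(string $file, string $table): string
{
    return "LOAD DATA INFILE '" . addslashes($file) . "' "
         . "INTO TABLE `$table` "
         . "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' "
         . "LINES TERMINATED BY '\\n'";
}

// Usage (assumes a live MySQL connection; credentials are placeholders):
// $db = new mysqli('localhost', 'user', 'pass', 'your_db');
// $db->query(buildLoadDataSql('flat-file.txt', 'your_table'));
```

Note that the MySQL user also needs the FILE privilege for LOAD DATA INFILE to work.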
Here is what I am doing with the file.
function1 - count the number of lines using substr_count($var, "\n")
Take the total number of lines and create a page with links to the "sub-categories" (each sub-category will consist of 20 lines from the text file, so $total / 20).
function2 - pull items from the file based on the math from function1. Link 1 would produce a page with the first sub-category (i.e. the first 20 entries in the flat-file db).
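The two functions described above might be sketched like this, assuming the file fits comfortably in memory (the names countPages/getPage and the PER_PAGE constant are illustrative, not from the post):

```php
<?php
const PER_PAGE = 20; // records per sub-category page

// function1: count the records and derive the number of sub-category pages.
function countPages(string $path): int
{
    $total = substr_count(file_get_contents($path), "\n");
    return (int) ceil($total / PER_PAGE);
}

// function2: pull the 20 records belonging to page $page (1-based).
function getPage(string $path, int $page): array
{
    $lines = file($path, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    return array_slice($lines, ($page - 1) * PER_PAGE, PER_PAGE);
}
```

One caveat with the substr_count() approach: if the last record has no trailing newline, it won't be counted, so counting the array from file() is the safer of the two.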
As I said, there are about 15,000 entries in this db. Is a flat file going to be too slow?
Thanks!