

Parse data from text file into MySQL


Stu_Rogers

8:10 am on Aug 7, 2008 (gmt 0)

10+ Year Member



A third-party service is uploading a text file to our server. I need to write the PHP to extract the data so I can insert it into a MySQL database.

I suspect a combination of PHP string functions will be required but I am concerned about structuring them in the most efficient way because the file may contain 1000s of rows and will be processed every night.

The txt file structure:

#HEADER#
EOF : '¦'
EOR : '~'
#DEFINITION#
ADDRESS_1¦POSTCODE1¦FEATURE1¦SUMMARY¦PRICE¦PUBLISHED_FLAG¦~
#DATA#
107 West Street¦DA2 1AW¦great location¦part furnished with cooker,washing machine, fridge freezer¦550¦0¦~
#END#

Which functions will I need?
fopen to open and read the file?
substr to isolate the data section?
explode to split out the rows and then the individual fields?

Thanks in advance for any assistance.

Stu_Rogers

2:43 pm on Aug 7, 2008 (gmt 0)

10+ Year Member



Ok, I've figured most of it out myself but would appreciate if anyone could take a look and advise whether this is the best method:

$input = file_get_contents("NETH.txt");

// substr() takes a length as its third argument, not an end offset,
// so the length of the #DATA# section has to be computed from the
// positions of the two markers
$start = stripos($input, "#DATA#");
$length = stripos($input, "#END#") - $start;
$data_outer = substr($input, $start, $length);

// strip the leading "#DATA#" marker (6 characters)
$data_inner = substr($data_outer, 6);

// the final "~" leaves a last element holding only the newline
// before #END#, so skip it
$rows = explode("~", $data_inner);
$number_rows = count($rows) - 2;

for ($i = 0; $i <= $number_rows; $i++) {
    $fields = explode("¦", trim($rows[$i]));
    //insert each array part into database here
}
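[Editor's note] For what it's worth, wrapping the parsing in a function makes it easy to test against a small inline sample before wiring in the database insert. This is only a sketch; parse_feed is an illustrative name, not a built-in:

```php
<?php
// Sketch: isolate the #DATA# section, split rows on "~" and fields
// on "¦", and return an array of field arrays. The trailing "¦"
// before each "~" yields an empty final field, which is kept as-is.
function parse_feed($input) {
    $start = stripos($input, "#DATA#") + strlen("#DATA#");
    $length = stripos($input, "#END#") - $start;
    $data = substr($input, $start, $length);

    $records = array();
    foreach (explode("~", $data) as $row) {
        $row = trim($row);
        if ($row === '') {
            continue; // skip the empty fragment after the final ~
        }
        $records[] = explode("¦", $row);
    }
    return $records;
}

// small sample in the same layout as the uploaded feed
$sample = "#HEADER#\nEOF : '¦'\nEOR : '~'\n#DEFINITION#\n"
        . "ADDRESS_1¦POSTCODE1¦FEATURE1¦SUMMARY¦PRICE¦PUBLISHED_FLAG¦~\n"
        . "#DATA#\n"
        . "107 West Street¦DA2 1AW¦great location¦part furnished¦550¦0¦~\n"
        . "#END#\n";

$records = parse_feed($sample);
```

Testing the function standalone like this also makes it easy to time against a generated file of several thousand rows before pointing it at the live feed.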

jatar_k

3:08 pm on Aug 7, 2008 (gmt 0)

WebmasterWorld Administrator 10+ Year Member



when you say "the file may contain 1000s of rows" you should test it with data sets larger than what you think you might get

if it works, then it's a good way to do it; test for speed and go from there
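[Editor's note] For the "1000s of rows" case, one way to keep memory flat while testing larger feeds is to stream the file with fgets() instead of loading it all with file_get_contents(). A sketch, assuming each data row sits on its own line ending in "~" as in the sample; it writes a small sample feed to a temp file purely for demonstration:

```php
<?php
// Write a two-row sample feed to a temp file (demonstration only;
// in production this would be the uploaded NETH.txt).
$path = tempnam(sys_get_temp_dir(), "feed");
file_put_contents($path,
    "#HEADER#\nEOF : '¦'\nEOR : '~'\n#DEFINITION#\n" .
    "ADDRESS_1¦POSTCODE1¦FEATURE1¦SUMMARY¦PRICE¦PUBLISHED_FLAG¦~\n" .
    "#DATA#\n" .
    "107 West Street¦DA2 1AW¦great location¦part furnished¦550¦0¦~\n" .
    "22 High Road¦DA3 2BB¦garden¦unfurnished¦675¦1¦~\n" .
    "#END#\n"
);

// Stream line by line: memory use stays constant no matter how
// many rows the feed grows to.
$records = array();
$in_data = false;
$fh = fopen($path, "r");
while (($line = fgets($fh)) !== false) {
    $line = trim($line);
    if ($line === "#DATA#") { $in_data = true; continue; }
    if ($line === "#END#") { break; }
    if (!$in_data || $line === "") { continue; }
    // drop the trailing "~" row terminator, then split on the
    // "¦" field separator declared in the header
    $records[] = explode("¦", rtrim($line, "~"));
}
fclose($fh);
unlink($path);
```

The same loop body is where the database insert would go, so each row is handled and discarded rather than accumulated.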