Forum Moderators: coopster


importing to mysql without LOAD DATA LOCAL

anyone have a nice routine handy?


amznVibe

5:17 am on Nov 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've just gone through the learning curve of realizing most virtual hosts have "LOAD DATA LOCAL" disabled :(

I have a two-meg delimited ASCII file I need to import on a regular basis (26 fields).

Anyone have a nice import routine using fopen they are willing to share before I attempt to hack one out myself? I can't seem to find such an animal out there (though some are close).

Such a shame; from what I have read, LOAD DATA is much faster...

Thanks for any help!
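For reference, the statement that many hosts disable looks like this (table and file names here are placeholders, and the line terminator depends on where the file was saved):

```sql
-- LOCAL makes the client, not the server, read the file;
-- that client-side read is exactly what many shared hosts turn off.
LOAD DATA LOCAL INFILE 'resources.txt'
INTO TABLE resources
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n';
```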

jatar_k

5:28 am on Nov 25, 2003 (gmt 0)

WebmasterWorld Administrator 10+ Year Member



can you not use LOAD DATA at all then?

amznVibe

7:25 am on Nov 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Uncertain, but I think you're implying I could do it with a trick from the command line? I can't use that technique, as this needs to be automated from the web (for novices).

amznVibe

7:30 am on Nov 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think I almost have it using fopen and fgetcsv.
What I can't figure out is why it's reading past the \r\n and into the next line, so the insert fails with "error - Column count doesn't match value count at row 1".
There are 26 fields per line, ending with a line break; doesn't fgetcsv figure that out by itself?

set_time_limit(5*60);
$filename = "resources.txt";
$delim = ",";
$f = fopen($filename, "r") or die("error - cannot open $filename");
while ($data = fgetcsv($f, 2000, $delim)) {
    foreach ($data as $key => $val) {
        $data[$key] = addslashes($val);   // escape quotes before building the INSERT
    }
    $insert = "INSERT INTO " . $sql_tbl . " VALUES ('" . implode("','", $data) . "')";
    $result = mysql_query($insert) or die("error - " . mysql_error());
}
fclose($f);

amznVibe

7:55 am on Nov 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Ah bingo, it's a Mac quirk. The original data file came off a (%$@#!) Mac, which uses \r instead of \n for line endings. So I resaved the file as DOS/Win to convert them, and now the above routine works perfectly.

The problem is, these folks will be uploading from Macs and I don't want them to have extra steps. This data file will only grow in size, so I don't want to convert it in RAM or save a converted copy to disk. Any super-creative ideas, anyone?
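One option the thread never lands on: PHP 4.3 added the auto_detect_line_endings setting, which makes fgets/fgetcsv treat a lone \r (the classic Mac line ending) as an end-of-line, so the file could be read as-is with no conversion pass at all. A minimal sketch, with a throwaway demo file standing in for the real upload:

```php
// Assumes PHP 4.3+ (the setting is deprecated in PHP 8.1 but still works).
ini_set('auto_detect_line_endings', '1');

// Demo file with Mac-style \r endings, standing in for the upload:
$filename = tempnam(sys_get_temp_dir(), 'csv');
file_put_contents($filename, "1,apple,red\r2,banana,yellow\r");

$rows = array();
$f = fopen($filename, 'r');
while ($data = fgetcsv($f, 2000, ',')) {
    $rows[] = $data;   // the real loop would build the INSERT here
}
fclose($f);
unlink($filename);

echo count($rows);   // both rows parsed, even with no \n in the file
```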

jatar_k

5:46 pm on Nov 25, 2003 (gmt 0)

WebmasterWorld Administrator 10+ Year Member



hm, trying to think of ways to handle it. Maybe take the returned fields, grab a set number, put the overflow into a separate array for that iteration, then put it back together on the next read.

Or get them to save as a PC file; all the Mac programs I have worked with have options for line endings. Often the save-as dialog offers Mac/PC/Unix.

Replacing chars would work, but the file is too big for that.

Just thinking out loud; I was looking at different functions like fgets but couldn't find anything I really liked.

Did you get any farther? Or maybe someone else has some ideas.
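The char-replacement idea can actually be done without holding the file in RAM, if writing a converted copy to disk is acceptable: stream it through in fixed-size chunks. A sketch (the function name is made up; the chunk-boundary check guards against a \r\n pair being split across two reads):

```php
// Convert Mac (\r) and DOS (\r\n) line endings to Unix (\n),
// reading 8 KB at a time so memory use stays constant.
function mac_to_unix($src, $dst) {
    $in  = fopen($src, 'r');
    $out = fopen($dst, 'w');
    $pendingCR = false;
    while (!feof($in)) {
        $chunk = fread($in, 8192);
        if ($chunk === '' || $chunk === false) break;
        if ($pendingCR && $chunk[0] === "\n") {
            $chunk = substr($chunk, 1);   // \r\n pair split across chunks
        }
        $pendingCR = (substr($chunk, -1) === "\r");
        fwrite($out, str_replace(array("\r\n", "\r"), "\n", $chunk));
    }
    fclose($in);
    fclose($out);
}
```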

amznVibe

6:45 am on Nov 26, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Still no luck finding an automated way to do the \r to \n conversion, but I am darn happy with the importing code above: it does 10,000 rows of 26 fields in a few seconds (with other server load). The nice thing is I can add extra field processing on the fly too, like converting bad Mac FileMaker/Word characters (curly quotes etc.), though that slows down processing a lot, unfortunately. Oh well.
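That curly-quote cleanup could be sketched as a single strtr() pass per field, using one lookup table instead of chained str_replace() calls. The byte values below assume the export is Mac Roman encoded (a Windows export would use 0x91-0x94 instead), and the function name is just illustrative:

```php
// Map Mac Roman "smart" quotes to plain ASCII equivalents.
function clean_mac_quotes($s) {
    static $map = array(
        "\xD2" => '"',   // left double quote
        "\xD3" => '"',   // right double quote
        "\xD4" => "'",   // left single quote
        "\xD5" => "'",   // right single quote
    );
    return strtr($s, $map);
}
```

This would slot into the import loop right next to the addslashes() call, one call per field.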