Msg#: 3582749 posted 11:42 am on Feb 23, 2008 (gmt 0)
I have a problem with a large SQL dump: it's just over 1M records. What's worse is that each INSERT has 7,000(ish) records, so it's 1M-odd records done within 200 INSERT statements.
I can't even import them one at a time. I tried BigDump, but that was no help either. I also tried pasting a single INSERT directly into the SQL window of phpMyAdmin, but got a timeout error (I don't fancy doing 200 of these anyway).
Msg#: 3582749 posted 5:23 pm on Feb 25, 2008 (gmt 0)
For a database import of that size you're going to have to go to the command line.
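A minimal sketch of the command-line route (the user, password, database, and file names here are placeholders for illustration, not details from this thread):

```shell
# Placeholders: dbuser / secret / mydb / dump.sql are illustrative only.
# Create a tiny stand-in dump so the command below has something to read:
printf 'CREATE TABLE IF NOT EXISTS demo (id INT);\n' > dump.sql

# The mysql client streams the file straight to the server, so there is
# no PHP or web-server timeout as there is with phpMyAdmin. Raising
# max-allowed-packet helps with very large multi-row INSERT statements.
mysql --user=dbuser --password=secret --max-allowed-packet=64M mydb < dump.sql \
  || echo "import skipped: no MySQL server reachable from this shell"
```

Run it on the database server itself if you can; piping a million rows over the wire is where most of the time goes.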
And a single INSERT with 7,000 rows sounds a bit OTT.
I would maybe think about dumping the data to a flat file (or files) and then doing a batch load, just so I could be sure that all the records went across OK. OK, I used to write billing systems, so I'm a bit anal about data integrity :-)
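The flat-file route could look something like this (database, table, and path names are assumptions for the sketch; `mysqldump --tab` and `mysqlimport` are the standard tools for it):

```shell
# Placeholders: dbuser / secret / mydb / mytable / /tmp/dumpdir.
mkdir -p /tmp/dumpdir

# --tab writes mytable.sql (schema) plus mytable.txt (tab-delimited rows),
# so every record is visible in a plain text file before it goes back in:
mysqldump --user=dbuser --password=secret --tab=/tmp/dumpdir mydb mytable \
  || echo "dump skipped: no MySQL server reachable"

# mysqlimport wraps LOAD DATA INFILE and loads mytable.txt into mytable:
mysqlimport --user=dbuser --password=secret --local mydb /tmp/dumpdir/mytable.txt \
  || echo "load skipped: no MySQL server reachable"
```

One nicety of this approach: you can `wc -l` the text file and compare it against `SELECT COUNT(*)` afterwards to confirm every record made it across.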
Msg#: 3582749 posted 11:49 pm on Feb 26, 2008 (gmt 0)
Something that may work for you depending on your situation is to stop using mysqldump and start using mysqlhotcopy [dev.mysql.com]. It's much easier and quicker to back up and restore... but you must do the backup on the same machine. It also has some other advantages and disadvantages compared to mysqldump.
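For reference, a mysqlhotcopy invocation is a one-liner (the names below are placeholders; note it only handles MyISAM tables and has to run on the database host itself):

```shell
# Placeholders: dbuser / secret / mydb / /tmp/backups.
mkdir -p /tmp/backups

# mysqlhotcopy locks the tables and copies the raw table files, which is
# why it is fast but only works locally on the server machine:
mysqlhotcopy --user=dbuser --password=secret mydb /tmp/backups \
  || echo "hotcopy skipped: needs a local MySQL server"
```

Restoring is just copying the files back into the data directory, which is where the speed advantage over replaying a million INSERTs comes from.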
[edited by: Clark at 11:49 pm (utc) on Feb. 26, 2008]