I have a problem with a large SQL dump: it's just over 1M records. Worse, each INSERT carries around 7,000 rows, so the 1M-odd records are packed into roughly 200 INSERT statements.
I can't even import them one at a time. I tried BigDump, but that was also no help. I tried pasting a single INSERT into the SQL window of phpMyAdmin, but got a timeout error (I don't fancy doing 200 of those anyway).
If anyone could help that would be great.
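One workaround worth sketching (not something the poster tried): pre-split each 7,000-row INSERT into smaller batches before importing, so no single statement is big enough to time out. A minimal Python sketch; it assumes the value tuples contain no unescaped parentheses inside string literals, which a real dump splitter would need to handle:

```python
import re

def split_insert(statement, batch_size=500):
    """Split one multi-row 'INSERT ... VALUES (...),(...);' statement
    into several INSERTs of at most batch_size rows each.
    Simplification: value tuples must not contain ')' inside strings."""
    head, _, tail = statement.partition("VALUES")
    rows = re.findall(r"\([^)]*\)", tail)  # each '(...)' is one row tuple
    return [
        head + "VALUES " + ",".join(rows[i:i + batch_size]) + ";"
        for i in range(0, len(rows), batch_size)
    ]
```

Running the output batches through the mysql command-line client (rather than phpMyAdmin) also avoids PHP's execution-time limit entirely.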
And a single INSERT with 7,000 rows sounds a bit over the top.
I would maybe think about dumping the data to a flat file (or files) and then doing a batch load, just so I could be sure that all the records went across OK. I used to write billing systems, so I'm a bit anal about data integrity :-)
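To illustrate the flat-file route: MySQL's LOAD DATA INFILE bulk-loads a delimited file much faster than equivalent INSERT statements, and running it from the mysql CLI sidesteps phpMyAdmin's timeout. A hedged sketch only; the table name, file path, and delimiters below are placeholders to adapt to the actual dump (and the server's secure_file_priv setting may restrict where the file can live):

```sql
-- Hypothetical table and path; adjust to your schema and server config.
LOAD DATA LOCAL INFILE '/tmp/records.csv'
INTO TABLE records
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';
```

Comparing SELECT COUNT(*) against the source row count afterwards gives the integrity check mentioned above.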
[edited by: Clark at 11:49 pm (utc) on Feb. 26, 2008]