I've got a table with about 10,000 rows in it.
I need to go through each row, manipulate the data, and then insert it into another table.
I'm simply doing the typical:
$query = "SELECT * FROM table";
$result = mysql_query($query,$db);
while ($myrow = mysql_fetch_array($result)) {(Process it and insert it into the new table)
}
i.e. fetching each row into $myrow one at a time, processing it, and then INSERTing it into the new table.
Problem is that it's taking around 10 minutes to run (the browser page goes white and won't flush() anything to the screen during this time).
Note I'm running it on a local Apache/MySQL/PHP installation on XP, and yes, I've really had to whack up the timeout values.
Am I doing something intrinsically wrong? Should I be tackling this in an entirely different manner?
Your advice appreciated!
[edited by: Markos at 11:42 am (utc) on Nov. 8, 2007]
The processing ranges from things like "if it's brand X then apply a formula to field Z" up to the formula for generating barcode numbers. There are about 400 lines of PHP carrying out the "processing".
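For context, a minimal sketch of that kind of per-row branching inside the fetch loop; the table and field names (new_table, brand, fieldZ) and the formula are hypothetical stand-ins, not the actual code:

while ($myrow = mysql_fetch_array($result)) {
    // e.g. if it's brand X, apply a formula to field Z before inserting
    if ($myrow['brand'] == 'X') {
        $myrow['fieldZ'] = $myrow['fieldZ'] * 1.175; // stand-in formula
    }
    $sql = sprintf("INSERT INTO new_table (brand, fieldZ) VALUES ('%s', '%s')",
        mysql_real_escape_string($myrow['brand'], $db),
        mysql_real_escape_string($myrow['fieldZ'], $db));
    mysql_query($sql, $db);
}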
Next, you should look at your queries and index all columns that are queried in WHERE clauses.
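For example, if the processing does lookups like WHERE brand = 'X' against another table, an index on that column spares MySQL a full table scan on every lookup. A minimal sketch, using a hypothetical table/column name:

// hypothetical names -- adjust to your own schema
mysql_query("ALTER TABLE products ADD INDEX idx_brand (brand)", $db);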
But there is much more to it; read about it here: MySQL OPTIMIZATION [dev.mysql.com]
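One of the bigger wins covered there is MySQL's multiple-row INSERT syntax: sending the processed rows in batches instead of issuing one query per row cuts the per-statement overhead dramatically. A minimal sketch, again with hypothetical table/column names:

$values = array();
while ($myrow = mysql_fetch_array($result)) {
    // ... process the row as before ...
    $values[] = sprintf("('%s', '%s')",
        mysql_real_escape_string($myrow['brand'], $db),
        mysql_real_escape_string($myrow['fieldZ'], $db));
    // flush a batch of 500 rows in a single INSERT
    if (count($values) == 500) {
        mysql_query("INSERT INTO new_table (brand, fieldZ) VALUES " . implode(',', $values), $db);
        $values = array();
    }
}
// insert any leftover rows
if (count($values) > 0) {
    mysql_query("INSERT INTO new_table (brand, fieldZ) VALUES " . implode(',', $values), $db);
}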