Forum Moderators: coopster


Inserting data from MySQL Table to another MySQL Table

         

shilmy

8:48 am on Feb 15, 2008 (gmt 0)

10+ Year Member



Hello,

I have one MySQL table with about 16,000 records. I need to get a subset of that data (about 3,000 records) entered into another table.

I have written a PHP script that queries the data from the first table and uses a while loop to process the rows one by one and insert them into the second table.

When I run the script locally on my notebook, it only inserts about 1,000 records.

When I run the script on my host (shared hosting), it only inserts about 100 records.

So, is there a trick to do this kind of job? Are those record counts too big for MySQL or PHP?

Looking forward to hearing your insights.

Thanks
Sjarief

RonPK

10:05 am on Feb 15, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Maybe you could use the INSERT ... SELECT syntax, which is made exactly for these situations:

INSERT INTO tbl2 (id, name) 
SELECT tbl1.id, tbl1.name
FROM tbl1 WHERE tbl1.name LIKE '%jack%';

shilmy

10:25 am on Feb 15, 2008 (gmt 0)

10+ Year Member



I have thought about using the INSERT ... SELECT syntax, but I need to process several columns with a PHP function before they are inserted into the second table.

Additional message...

OK, I've got it: I will use INSERT ... SELECT, then process the data from the second table with PHP and update that table.

Is that a good solution, or is there a better one?

RonPK

10:36 am on Feb 15, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



OK.
Multiple INSERTs can often be done faster by locking the table:

mysql_query('LOCK TABLES mytable WRITE');
while (...) {
    // inserts here
}
mysql_query('UNLOCK TABLES');

You could also set the script execution time limit to infinite with set_time_limit(0). That way it won't stop after the default 30 seconds.
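Taken together, a minimal sketch of the insert loop with both tips applied might look like this (the table name `mytable` is made up, and the legacy `mysql_*` functions match the rest of this thread; they were removed in PHP 7):

```php
<?php
set_time_limit(0);                          // lift the default 30-second execution limit

mysql_query('LOCK TABLES mytable WRITE');   // take the write lock once, up front
while ($row = mysql_fetch_array($result)) { // $result assumed to come from an earlier SELECT
    // process $row and run the INSERT into mytable here
}
mysql_query('UNLOCK TABLES');               // release the lock when done
```

Holding the lock for the whole loop means MySQL doesn't have to re-acquire and release it for every single INSERT.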

shilmy

3:33 am on Feb 16, 2008 (gmt 0)

10+ Year Member



I've changed the script; now I use INSERT ... SELECT. It works flawlessly: all query results are inserted into the second table.

But the problem still happens when I try to update the records in the second table. Here is the script I use.

-----

$select_query = "select * from second_table";
$result = mysql_query($select_query);
while ($row = mysql_fetch_array($result))
{
    extract($row);
    .....
    /* Do Some Processing */
    .....

    $update_query = "update second_table set Column_1='$Column_1', Column_2='$Column_2' where Record_ID='$Record_ID'";
    $update_result = mysql_query($update_query);
}

-------

The script always times out before processing all the records.

I tried locking as suggested, but it didn't help. It still times out.

I used set_time_limit(0) and tried it locally on my notebook, and it worked: all the records were processed and updated. But I haven't tried it on my host yet. Do you think it's OK to run it in a shared hosting environment?

RonPK

8:45 pm on Feb 16, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm not on your server :)
I guess it shouldn't be a problem. A decent server should be able to handle a short peak.

Another trick might be to temporarily disable the keys on the table. Your INSERTs and UPDATEs probably take forever because MySQL needs to rebuild the indexes after each INSERT or UPDATE. Use ALTER TABLE ... DISABLE KEYS (manual [dev.mysql.com]) and ... ENABLE KEYS when you're done. It only works on non-unique keys, since unique keys have to be checked on every INSERT anyway.
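For example, a sketch using the second_table name from earlier in this thread (note this only helps for non-unique indexes, and only on MyISAM tables):

```sql
ALTER TABLE second_table DISABLE KEYS;  -- stop maintaining non-unique indexes

-- ... run all the INSERTs / UPDATEs here ...

ALTER TABLE second_table ENABLE KEYS;   -- rebuild the indexes once, at the end
```

The index is then rebuilt once at the end instead of being updated row by row, which is usually much faster for bulk loads.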