I have a function that accesses the LOCAL database for records in the cart table.
It runs a while loop over each LOCAL record in the cart table.
WITHIN that loop, a new REMOTE db connection is established and a record is written to the REMOTE DB.
So, basically, I have a function that is bouncing back and forth between database servers, local and remote.
How, in a perfect world, would one handle this?
Here's how I'm doing it now, but I feel like it's kludgy:
$dbh = connect_to_db_local();
while (/* each cart row */) {
    write_order_record($data_to_write); // this function starts with $dbh = connect_to_db_remote(); and ends with mysql_close();
}
Is this the only way to handle this, or is there a more graceful way? If you guys need the whole code, I'll post it.
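For reference, here's a minimal sketch of a more direct approach, assuming the old mysql_* API the snippets above use: mysql_connect() returns a link resource, so you can open both connections once, pass the remote link to each query, and skip the per-iteration reconnect. Hosts, credentials, and column names below are placeholders.

$local = mysql_connect('localhost', 'user', 'pass');           // placeholder credentials
mysql_select_db('shop', $local);

$remote = mysql_connect('remote.example.com', 'user', 'pass'); // placeholder host
mysql_select_db('orders', $remote);

$result = mysql_query('SELECT * FROM cart', $local);
while ($row = mysql_fetch_assoc($result)) {
    $orderid    = mysql_real_escape_string($row['orderid'], $remote);
    $customerid = mysql_real_escape_string($row['customerid'], $remote);
    // Reuse the already-open remote link for every write.
    mysql_query("INSERT INTO orders (orderid, customerid) VALUES ('$orderid', '$customerid')", $remote);
}

mysql_close($remote);
mysql_close($local);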
If you want to get fancy, you could delay your remote update requests until you've finished the while loop, then submit them all at once. If they're INSERTs, you might use a MySQL bulk insert statement. Something like:
$ordersToWrite = array();

open_local_db();
while (localResults...) {  // pseudocode: loop over the local cart rows
    $ordersToWrite[] = array('orderid' => '...', 'customerid' => '...' /* ... */);
}
close_local_db();

if (!empty($ordersToWrite)) {
    $sql  = 'INSERT INTO ORDERS (orderid, customerid) VALUES ';
    $bulk = array();
    foreach ($ordersToWrite as $row) {
        // Curly braces are needed to interpolate quoted array keys inside a double-quoted string.
        $bulk[] = "({$row['orderid']}, {$row['customerid']})";
    }
    $sql .= implode(',', $bulk);

    open_remote_db();
    // etc.
}
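One caveat with splicing values straight into $sql: if orderid or customerid are strings rather than integers, each value needs quoting and escaping first. mysql_real_escape_string() requires an open connection, so you'd escape after open_remote_db() (or pass a link explicitly). Something like this, where $remote is an assumed link resource:

$oid = mysql_real_escape_string($row['orderid'], $remote);
$cid = mysql_real_escape_string($row['customerid'], $remote);
$bulk[] = "('$oid', '$cid')";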
>>Just make sure your script doesn't time out.
This would be my biggest concern as well. Maybe make your local operation transactional and, if the remote write returns successfully, COMMIT the local transaction.
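A minimal sketch of that idea, assuming the cart table uses a transactional engine such as InnoDB (MyISAM silently ignores transactions), and reusing assumed link variables $local and $remote plus the bulk INSERT string built above as $sql:

set_time_limit(0);  // also addresses the timeout concern quoted above

mysql_query('START TRANSACTION', $local);
// ... flag or delete the processed cart rows on the local link ...
$ok = mysql_query($sql, $remote);  // the bulk INSERT from the earlier post

if ($ok) {
    mysql_query('COMMIT', $local);    // remote write succeeded, keep local changes
} else {
    mysql_query('ROLLBACK', $local);  // remote write failed, undo local changes
}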