If I just grab an array of database login info (username, password, hostname) and run mysqldump against each entry in a PHP loop (something like the sketch below), they all have to "wait their turn": each dump runs sequentially, so if one database holds a huge amount of data, or there are a large number of databases to process, the whole job takes far longer than it would if I could multi-thread the work and kick off multiple dumps at once.
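For reference, here is roughly what the sequential version looks like; the $databases array and the output file naming are hypothetical:

<?php
// Sequential version: each dump must finish before the next one starts.
$databases = array(
    array('user' => 'u1', 'pass' => 'p1', 'host' => 'host1', 'name' => 'db1'),
    array('user' => 'u2', 'pass' => 'p2', 'host' => 'host2', 'name' => 'db2'),
);

foreach ($databases as $db) {
    $cmd = sprintf(
        'mysqldump --user=%s --password=%s --host=%s %s > %s.sql',
        escapeshellarg($db['user']),
        escapeshellarg($db['pass']),
        escapeshellarg($db['host']),
        escapeshellarg($db['name']),
        escapeshellarg($db['name'])
    );
    exec($cmd); // blocks until this dump completes
}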
Can anyone advise me on this? Is it worth doing in PHP, or should I try Python or something else that is more "batch friendly"? Or is PCNTL the ticket for this type of need?
Thanks!
- You can "fork" in PHP (see the pcntl_fork function); there is a sketch after this list.
- Use a system call (see the system or exec functions) to run a secondary PHP script that handles an individual dump. This should use slightly fewer resources than forking (second sketch below).
- Use non-blocking sockets to run the secondary PHP script from above. This makes sense if you want to spread the load across multiple servers (third sketch below).
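Here is a rough sketch of the pcntl_fork approach, reusing the hypothetical $databases array from the question. Each child runs one mysqldump and exits; the parent waits for all of them. Note that the pcntl extension is only available in CLI builds of PHP:

<?php
// Assumes $databases is an array of connection details as shown earlier
// and that mysqldump is on the PATH.
$children = array();

foreach ($databases as $db) {
    $pid = pcntl_fork();
    if ($pid == -1) {
        die("Could not fork\n");
    } elseif ($pid == 0) {
        // Child process: run exactly one dump, then exit.
        $cmd = sprintf(
            'mysqldump --user=%s --password=%s --host=%s %s > %s.sql',
            escapeshellarg($db['user']),
            escapeshellarg($db['pass']),
            escapeshellarg($db['host']),
            escapeshellarg($db['name']),
            escapeshellarg($db['name'])
        );
        exec($cmd, $output, $status);
        exit($status);
    } else {
        // Parent process: remember the child's PID and keep looping,
        // so all dumps run at the same time.
        $children[] = $pid;
    }
}

// Wait for every child to finish before the script ends.
foreach ($children as $pid) {
    pcntl_waitpid($pid, $status);
}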
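And a minimal sketch of the system-call option. dump_one.php is a hypothetical worker script that takes a database name and runs a single mysqldump; redirecting output and appending & makes exec() return immediately, so all the dumps end up running concurrently:

<?php
$databases = array('db1', 'db2', 'db3');

foreach ($databases as $name) {
    // Background the worker with & so exec() does not wait for it.
    exec(sprintf('php dump_one.php %s > /dev/null 2>&1 &', escapeshellarg($name)));
}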
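Finally, a fire-and-forget sketch of the non-blocking socket idea, assuming each backup server exposes a hypothetical dump.php worker over HTTP that performs one mysqldump. The hostnames and query string are made up:

<?php
$jobs = array(
    'backup1.example.com' => 'db1',
    'backup2.example.com' => 'db2',
);

$sockets = array();
foreach ($jobs as $host => $db) {
    $fp = fsockopen($host, 80, $errno, $errstr, 5);
    if ($fp === false) {
        echo "Could not reach $host: $errstr\n";
        continue;
    }
    // Send the request, then switch to non-blocking mode so this
    // loop never sits waiting for a response.
    fwrite($fp, "GET /dump.php?db=" . urlencode($db) . " HTTP/1.0\r\n"
              . "Host: $host\r\nConnection: close\r\n\r\n");
    stream_set_blocking($fp, false);
    $sockets[$host] = $fp;
}

// All servers are now dumping in parallel. If you care about the results,
// poll $sockets with stream_select() and fread() until each worker replies.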