I have a SaaS CRM / sales-management application built primarily on PHP/MySQL with a smattering of JavaScript. My customers use it to manage their businesses. One of its functions is an export of their data, which they can display on their own websites by fetching it with cURL.
The problem is that, depending on the size of the extract, the MySQL queries and PHP loops can take a while to run, so the user waits 30 seconds or more for the data to display. The data only really needs to be current every 12-24 hours, so my strategy is to create a script that runs as a cron job once per hour and builds a simpler, flattened table that the website extracts can read very quickly.
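For reference, the hourly rebuild I have in mind looks roughly like this. This is a minimal sketch only: the `export_cache` table, the `customers`/`orders` schema, and the connection details are all placeholders, not my real application.

```php
<?php
// Sketch of the hourly cron script. Table and column names (export_cache,
// customers, orders) and the DSN are placeholders, not the real schema.
$pdo = new PDO('mysql:host=localhost;dbname=crm', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Rebuild the flat table inside a transaction so the website extracts
// never see a half-populated cache.
$pdo->beginTransaction();
$pdo->exec('DELETE FROM export_cache');
$pdo->exec(
    'INSERT INTO export_cache (customer_id, order_count, last_order_at)
     SELECT c.id, COUNT(o.id), MAX(o.created_at)
     FROM customers c
     LEFT JOIN orders o ON o.customer_id = c.id
     GROUP BY c.id'
);
$pdo->commit();
```

The idea is that pushing the aggregation into a single `INSERT ... SELECT` keeps the looping inside MySQL rather than iterating over rows in PHP, which is part of why I expected the rebuild to be fast enough.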
However, given the number of customers and the amount of their data, the cron job itself takes a while: about 2.2 minutes on my development machine (a MacBook Pro), and noticeably longer on my hosted VPS, where it times out after roughly 1.7 minutes and produces only about half the data.
My hosting company won't give me access to php.ini because the server hosts multiple VPS accounts; instead, they provide a "phprc" file. I have added the line `max_execution_time = 1500` to my phprc file, but the script still times out in under 2 minutes.
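For completeness, the relevant phprc fragment is just the one line; the comments are my own working notes, and the mod_fcgid part is an assumption on my part about how the host's FastCGI setup might be enforcing its own limit.

```ini
; phprc fragment, as I have it now
max_execution_time = 1500

; Assumption: under FastCGI, the process manager can enforce its own
; wall-clock limit regardless of this value. On Apache with mod_fcgid,
; those limits are directives such as FcgidIOTimeout and FcgidBusyTimeout,
; which only the host can change, so a phprc setting would not help.
```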
The hosting company's response to my support ticket was: "my best recommendation is to change the PHP mode from FastCGI to CGI, the reason for this change is that FastCGI optimizations have a fixed execution time that can bypass the max_execution_time value".
This seems ridiculous. From everything I have read, reverting to CGI would have serious performance implications for the site as a whole.
Am I on the wrong track entirely? Should I switch to a dedicated server? Any suggestions?