

Running Long Scripts

script time out, max_execution time


SeanF

2:03 pm on Jan 16, 2022 (gmt 0)

5+ Year Member Top Contributors Of The Month



I have a SaaS CRM / Sales Management application built primarily on PHP/MySQL with a smattering of JavaScript. My customers use it to manage their businesses. One of the functions is an export of their data, which they can display on their own websites using cURL.

The problem is that, depending on the size of the data extract, the MySQL queries/loops can take a while to run, so the user is left waiting 30 seconds or more for the data to display. The data really only needs to be "current" every 12-24 hours, so my strategy is to create a script that runs as a cron job once per hour and builds a simpler table that the website extracts can read very quickly.
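
In rough outline, the hourly job looks something like this (heavily simplified; the table names, credentials, and aggregation query are placeholders, and building into a staging table before swapping is just one way to keep readers from seeing a half-built table):

<?php
// Hourly rebuild, simplified sketch. "export_cache"/"export_cache_new"
// and the orders aggregation are made-up placeholder names.
$db = new PDO('mysql:host=localhost;dbname=crm', 'user', 'secret');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Build the flattened export into a staging table first...
$db->exec("DROP TABLE IF EXISTS export_cache_new");
$db->exec("CREATE TABLE export_cache_new LIKE export_cache");
$db->exec("INSERT INTO export_cache_new (customer_id, order_count)
           SELECT customer_id, COUNT(*) FROM orders GROUP BY customer_id");

// ...then swap it in. RENAME TABLE is atomic in MySQL, so the website
// extracts never read a half-built table.
$db->exec("RENAME TABLE export_cache TO export_cache_old,
                        export_cache_new TO export_cache");
$db->exec("DROP TABLE export_cache_old");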

However, given the number of customers and the amount of data, this cron job takes a while to run: 2.2 minutes on my development server, a MacBook Pro. It takes a lot longer on my hosted VPS, where it times out after about 1.7 minutes and only produces about half the data.

My hosting company won't allow access to php.ini because the server hosts multiple VPS accounts; instead, they use a "phprc" file. I have included the line max_execution_time = 1500 in my phprc file, but the script still times out in less than 2 minutes.
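
(For anyone debugging the same thing: a one-line script run through the web server shows the value PHP actually sees, which tells you whether the phprc override is being read at all.)

<?php
// Prints the effective max_execution_time under the web SAPI.
// If this still shows the default (often 30), the phprc override
// isn't being picked up.
echo ini_get('max_execution_time');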

The hosting company's response to my support question was: "my best recommendation is to change the PHP mode from FastCGI to CGI, the reason for this change is that FastCGI optimizations have a fixed execution time that can bypass the max_execution_time value".

This seems ridiculous. From what I have read, reverting to CGI will have terrible performance implications for the site generally.

Am I on totally the wrong track? Should I switch to a dedicated server? Any suggestions?

jay5r

5:41 pm on Jan 16, 2022 (gmt 0)

10+ Year Member Top Contributors Of The Month



Can you do what you need done in batches? Let's say you have 100,000 things that need to get done. So every X minutes you do the oldest 10,000 of them. After 10 iterations you've done them all.
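
Roughly like this, if the rows you're flattening can carry a "processed" marker (the table, columns, and batch size here are all placeholders to adjust to your schema):

<?php
// One cron run: grab the oldest unprocessed batch, handle it, exit.
// "export_queue", "processed_at", and the batch size are placeholders.
$db = new PDO('mysql:host=localhost;dbname=crm', 'user', 'secret');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$ids = $db->query(
    "SELECT id FROM export_queue
     WHERE processed_at IS NULL
     ORDER BY id
     LIMIT 10000"
)->fetchAll(PDO::FETCH_COLUMN);

$mark = $db->prepare("UPDATE export_queue SET processed_at = NOW() WHERE id = ?");

foreach ($ids as $id) {
    // ...do the expensive per-row work here...
    $mark->execute([$id]);
}
// Each run stays well under the timeout; the next run picks up
// the next-oldest 10,000.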

SeanF

8:32 pm on Jan 16, 2022 (gmt 0)

5+ Year Member Top Contributors Of The Month



Thanks... that was something I was considering: a different cron job for each of my customers.

robzilla

2:50 am on Jan 17, 2022 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If you have a VPS, why don't you run the script directly through the command-line interface (php-cli) instead of through the web server?

So you'd have something like: @hourly /usr/bin/php -f /path/to/script.php

No time constraints that way, and you're not blocking a web server thread with the cronjob.
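
The CLI SAPI runs with max_execution_time = 0 (no limit) by default. If the same script is also reachable over the web, a guard at the top keeps the long-running path cron-only (a minimal sketch):

<?php
// Refuse to run this script through the web server; cron/CLI only.
if (php_sapi_name() !== 'cli') {
    http_response_code(403);
    exit("CLI only\n");
}
// From here on there's no FastCGI timeout to worry about: the CLI
// SAPI defaults to max_execution_time = 0 (unlimited).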

NickMNS

3:50 am on Jan 17, 2022 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Another option: instead of creating a new table each time, keep the "simpler" table around permanently, and each time there is a relevant write to the "complex" tables, immediately update the simpler table too. You will likely take a small performance hit on each write, but it probably won't be noticeable, and the data is available immediately whenever the user needs it.

The other benefit to this approach is that it isn't impacted by the size of the data. Say you have 1M records now and it takes 2 minutes, when that data grows to 10M, your batch process will take 20 minutes (assuming it's linear). The downside to the approach is that there is a cost in terms of additional storage space since some data will be duplicated. But everything is a trade-off.