Forum Moderators: coopster
The script loops through the list: it starts at the first address, runs a check function, then moves to the next address. At the end it uses fopen and fwrite to update a report on my website.
It works great, but it's slow. Personally, I'm not too concerned about the speed itself; a cron job runs this script, so it's not like I have to wait the 20 seconds for it to complete. However, I am concerned that under some circumstances, or if I add more addresses to monitor, my script will time out and never check some parts of my list.
I know I could set up two cron jobs and split the list in half, but that seems like a band-aid, and it does nothing if a site on my list just happens to be sluggish one day. Is there a better way for me to address this concern?
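For reference, the kind of loop I'm describing looks roughly like this (the URL list, check logic, and report filename are placeholders, not my actual code):

```php
<?php
// Placeholder list of addresses to monitor.
$urls = ['https://example.com/', 'https://example.org/'];

$report = '';
foreach ($urls as $url) {
    // Fetch each page in turn; a slow site stalls the whole loop here.
    $body   = @file_get_contents($url);
    $status = ($body === false) ? 'DOWN' : 'UP';
    $report .= "$url: $status\n";
}

// At the end, write the results out to the report file on the site.
$fp = fopen('report.txt', 'w');
fwrite($fp, $report);
fclose($fp);
```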
You can pass file_get_contents a context [php.net] to set a specific timeout for each request. That way, slow URLs won't hang the script forever.
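A minimal sketch of that approach; the URL and the 5-second timeout are arbitrary values for illustration:

```php
<?php
// Build a stream context with a per-request timeout (in seconds).
$context = stream_context_create([
    'http' => [
        'timeout' => 5, // give up on this request after 5 seconds
    ],
]);

// Pass the context as the third argument to file_get_contents.
$html = @file_get_contents('https://example.com/', false, $context);
if ($html === false) {
    // The request timed out or otherwise failed; record it and move on.
    echo "https://example.com/ did not respond in time\n";
}
```

With this, one sluggish site only costs you the timeout you chose, instead of holding up every address after it in the list.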