
Worried that my script will time out

itledi

10:53 am on Jan 29, 2008 (gmt 0)

10+ Year Member



I wrote a script, activated by a cron job, that uses the file_get_contents function on about 20 URLs each day to monitor whether any of them have changed.

The script loops through the list: it starts at the first address, runs a function on it, then moves on to the next. At the end it uses fopen and fwrite to update a report on my website.

It works great, but it's slow. Personally, I'm not too concerned about the speed; since a cron job runs the script, it's not as if I have to wait the 20 seconds for it to complete. However, I am concerned that under some circumstances, or if I add more addresses to monitor, the function will time out and never check part of my list.

I know I could set up two cron jobs and split the list in half, but that seems like a band-aid: it does nothing if a site on my list just happens to be sluggish on a given day. Is there a better way for me to address this concern?
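
For reference, a minimal sketch of the kind of script described above; the URLs, file paths, and hash-comparison approach are illustrative assumptions, not necessarily what the poster actually wrote:

<?php
// Illustrative sketch only; all URLs and paths are assumptions.
$urls = array(
    'http://www.example.com/page1.html',
    'http://www.example.com/page2.html',
    // ... roughly 20 URLs in total
);

$report = '';
foreach ($urls as $url) {
    $contents = file_get_contents($url);
    if ($contents === false) {
        $report .= "$url: fetch failed\n";
        continue;
    }
    // compare a hash of the page against the one saved last run
    $hashFile = '/tmp/monitor_' . md5($url) . '.hash';
    $oldHash  = file_exists($hashFile) ? file_get_contents($hashFile) : '';
    $newHash  = md5($contents);
    $report  .= ($newHash !== $oldHash) ? "$url: changed\n" : "$url: unchanged\n";
    file_put_contents($hashFile, $newHash);
}

// update the report on the website, as described above
$fp = fopen('/tmp/monitor_report.txt', 'w');
fwrite($fp, $report);
fclose($fp);
?>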

PHP_Chimp

11:59 am on Jan 29, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Instead of using file_get_contents and checking the actual contents, could you not use filectime [uk2.php.net] or filemtime [uk2.php.net], depending on what you are looking for? I'm guessing filemtime is what you want, as it should be a lot faster.
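
A minimal sketch of that approach, assuming the monitored items are local paths the script can stat (the file list and the previous-run timestamp are illustrative assumptions):

<?php
// Sketch of the filemtime approach; paths and timestamp are assumptions.
$files = array(
    '/var/www/pages/one.html',
    '/var/www/pages/two.html',
);
$lastRun = time() - 86400; // assume the job last ran 24 hours ago

foreach ($files as $file) {
    $mtime = filemtime($file); // last-modified time, false on failure
    if ($mtime !== false && $mtime > $lastRun) {
        echo "$file was modified at " . date('r', $mtime) . "\n";
    }
}
?>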

whoisgregg

2:25 pm on Jan 29, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I suspect that itledi is using file_get_contents on URLs from other websites... I'm not sure that filemtime or filectime will behave reliably with off-site URLs. If those other sites are dynamic, they may always return a modified date of now without there being any actual changes.

You can pass file_get_contents a context [php.net] to set a specific timeout for each request. That way, slow URLs won't hang the script forever.
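
For example, something along these lines, using stream_context_create; the 5-second timeout is an arbitrary choice:

<?php
// Per-request timeout via a stream context; 5 seconds is an assumption.
$context = stream_context_create(array(
    'http' => array(
        'timeout' => 5, // give up on any single request after 5 seconds
    ),
));

$contents = @file_get_contents('http://www.example.com/', false, $context);
if ($contents === false) {
    // too slow or unreachable; note it and move on to the next URL
    echo "Fetch failed or timed out; skipping.\n";
}
?>

Also worth noting: when the cron job runs the script under the PHP CLI, max_execution_time defaults to 0 (unlimited), and you can call set_time_limit(0) to be safe, so the script itself shouldn't be killed partway through the list.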