Forum Moderators: phranque


crontab problem

job running too slow


adwhite

2:58 pm on Dec 17, 2007 (gmt 0)

I have a PHP job running under crontab which runs every 2 minutes to see if there is anything to do. If there is, the job does the processing; if there isn't, it exits.

I'm using the crontab line :-
*/2 * * * * wget [#*$!.co.uk...]

This works fine, but the wget runs for nearly 60 seconds (at 95% CPU) whilst it resolves the web address.

I tried running it as a command line entry within crontab, e.g. :-

*/2 * * * * /usr/bin/php -q /home/sites/#*$!x/web/#*$!x.php

and this was as quick as you like, except that the command line version of PHP has not been compiled with all the modules the process needs, so when it has to action something (like imagecreatefromjpeg) it can't find the module.
I don't want to try recompiling the command line version of PHP in case I screw something else up.

Is there a way of combining the two concepts and cut out the need to resolve the web address?

Cheers

Andy

Romeo

4:50 pm on Dec 17, 2007 (gmt 0)

Your script apparently needs to run in a known environment where it knows about all paths necessary to find additional libraries, etc.

Try experimenting with something like this:

If your crontab is a USER crontab, perhaps this extra login may help to pick up the environment:
*/2 * * * * /usr/bin/bash --login -c '/usr/bin/php -q /home/sites/x/web/script.php'

In case of a root crontab, try this:
*/2 * * * * su - USER -c '/usr/bin/php -q /home/sites/x/web/script.php'
to let cron log in as USER and pick up the full environment from the USER's profiles.
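To see exactly what environment cron actually hands your job (a debugging sketch, untested like the rest), you can temporarily add an entry that dumps it to a file, then compare that file with the output of `env` in your normal login shell:

```shell
# Temporary debugging entry: capture the environment cron provides.
# The output path /tmp/cron-env.txt is just an example.
* * * * * env > /tmp/cron-env.txt 2>&1
```

Whatever differs (PATH, HOME, etc.) is usually what the script is missing. Remember to remove the entry afterwards.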

Untested, from memory, so YMMV.
HTH and HAND.
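Another thought, equally untested: if the site is served from the same box, you may be able to keep the wget approach but cut out the DNS lookup by pointing wget at the loopback address and passing the real hostname as a Host header (the hostname and path below are placeholders):

```shell
# Sketch only: hits the local web server directly, so no DNS lookup,
# while the Host header still routes the request to the right vhost.
*/2 * * * * wget -q -O /dev/null --header='Host: www.example.co.uk' http://127.0.0.1/script.php
```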

adwhite

10:48 am on Jan 7, 2008 (gmt 0)

just an update to close the thread...

It turns out that the crontab job running under root was writing an incremental log file into the root directory each time it ran, for example test.php.#*$! where #*$! is an incrementing number.
Because the job was running every minute, the number of files was enormous, and it was obviously having trouble allocating the incremental number (or something like that). When I deleted the (empty) files the job went back to running properly, so now I just have to do some weekly housekeeping on that directory (a cron job).
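That housekeeping can be done with a `find ... -delete` one-liner. The directory and file pattern below are assumptions based on the description above; here it is demonstrated against a throwaway directory rather than the real one:

```shell
# Simulate the pile-up of numbered log files in a temp directory.
logdir=$(mktemp -d)
touch "$logdir/test.php.1" "$logdir/test.php.2" "$logdir/notes.txt"

# Remove only the numbered log files, leaving everything else alone.
find "$logdir" -maxdepth 1 -name 'test.php.*' -type f -delete

ls "$logdir"    # only notes.txt remains
```

In a root crontab the weekly job might look like `0 4 * * 0 find /root -maxdepth 1 -name 'test.php.*' -type f -delete` (day, time, and path are placeholders).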

Regards

Andy