I'm using the crontab line :-
*/2 * * * * wget [#*$!.co.uk...]
This works fine, but the wget runs for nearly 60 seconds (at 95% CPU) whilst it resolves the web address.
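As an aside, wget has flags to run quietly and to throw the fetched page away instead of saving a copy in the working directory; a minimal sketch of that variant, with a placeholder URL:

# -q runs quietly, -O /dev/null discards the fetched page rather than saving it
*/2 * * * * wget -q -O /dev/null http://www.example.co.uk/script.php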
I tried running it as a command-line entry within crontab, e.g. :-
*/2 * * * * /usr/bin/php -q /home/sites/#*$!x/web/#*$!x.php
and this was as quick as you like, except that the command-line version of PHP hasn't been compiled with all the modules the script needs, so when it has to do something like imagecreatefromjpeg() it can't find the function.
I don't want to try recompiling the command-line version of PHP in case I screw something else up.
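Before going anywhere near a recompile, it may be worth confirming what that binary was actually built with: PHP's -m switch lists its compiled-in modules, and imagecreatefromjpeg() lives in the gd extension. A quick check, assuming the same path as the crontab line above:

# List compiled-in modules and look for the gd extension
/usr/bin/php -m | grep -i gd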
Is there a way of combining the two approaches and cutting out the need to resolve the web address?
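One sketch of a way to keep the web-server PHP but skip the remote DNS lookup, assuming the site is hosted on the same box: pin the hostname locally in /etc/hosts so resolution never leaves the machine (www.example.co.uk is a placeholder for the real name):

# /etc/hosts - resolve the site's name locally instead of via DNS
127.0.0.1   www.example.co.uk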
Cheers
Andy
Try experimenting with something like this:
If your crontab is a USER crontab, this extra login step may help to pick up the environment:
*/2 * * * * /usr/bin/bash --login -c "/usr/bin/php -q /home/sites/x/web/script.php"
In the case of a root crontab, try this:
*/2 * * * * su - USER -c "/usr/bin/php -q /home/sites/x/web/script.php"
to let cron log in as USER and pick up the full environment from the USER's profile scripts.
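If it still misbehaves, a classic way to see exactly what environment cron hands a job, for comparison with the output of env in a login shell, is a throwaway crontab entry (the file path is arbitrary):

# Dump cron's environment so it can be diffed against a login shell's
*/2 * * * * env > /tmp/cron-env.txt 2>&1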
Untested, from memory, so YMMV.
HTH and HAND.
It turns out that the crontab job running under root was saving an incrementally numbered file into root's directory each time it ran, for example test.php.#*$! where #*$! is an incrementing number (wget saves each page it fetches, appending a number to the filename when the file already exists).
Because the job was running every two minutes, the number of files was enormous, and it was obviously having trouble allocating the next number (or something like that). When I deleted the (empty) files, the job went back to running properly, so now I just have to do some weekly housekeeping on that directory (another cron job).
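For what it's worth, that weekly housekeeping could itself be a one-line crontab entry built on find, assuming the leftovers land in /root and GNU find's -delete is available:

# Every Sunday at 03:00, remove week-old wget leftovers
0 3 * * 0 find /root -maxdepth 1 -name 'test.php.*' -mtime +7 -delete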
Regards
Andy