Cron Job Problem
skeddy
msg:914425
1:16 pm on Jan 6, 2005 (gmt 0)

Hi there,

I'm new to using Cron Jobs on my website, which is hosted by One and One.

I have done some research, and have come up with a backup.php script that backs up my MySQL database whenever it is run.

I have also tried to set up a cron job that will run it every 12 hours (currently set for every 1 min, testing only!), but I am getting an error message.

The Cron job is set up as follows:

MAILTO=rob@example.com
*/1 * * * * /usr/local/bin/php -f /kunden/homepages/33/d85254132/htdocs/SQLBackup.php

The backup.php page runs the following:


#!/usr/local/bin/php
<?php
$file = "Databasebackup-".Date("m-d-y-H-i-s");
@exec("mysqldump -h DBHOST -u DBUSERNAME -pDBPASSWRD DB gzip > /homepages/33/d85254132/htdocs/mysql_backup/".$file.".txt.gz");
echo "Back up job complete"
?>

The backup.php page dumps the database into a gzipped file, and I can run it from a browser, mobile phone, anywhere, and it does the job.

But when I use the above cron job, I get the following message from the cron daemon:


X-Cron-Env: <MAILTO=rob@example.com>
X-Cron-Env: <SHELL=/bin/sh>
X-Cron-Env: <HOME=/kunden/homepages/33/d85254132/htdocs>
X-Cron-Env: <PATH=/usr/bin:/bin>
X-Cron-Env: <LOGNAME=*********>
Message-Id: <E1ClAHl-0007AH-00@mrvnet.example.com>
X-Provags-ID: example.com abuse@example.com sender-info:85254132@infong255

No input file specified.

I can run /usr/local/bin/php /kunden/homepages/33/d85254132/htdocs/backupjob2.php directly from PuTTY and it does the backup fine.

Does anyone have any thoughts? (Apologies if the layout above is awful!)

[edited by: rogerd at 2:02 pm (utc) on Jan. 6, 2005]
[edit reason] examplified [/edit]

 

MattyMoose
msg:914426
5:47 pm on Jan 6, 2005 (gmt 0)

There are differences in environment when you run a cron, and when you run a script while logged in. It's typically a stripped-down shell, with a minimal $PATH, etc etc.

What I'd suggest doing is echoing everything that you're going to execute.
So, when you have your exec, echo what is actually being exec'd and so on. It may be that it doesn't know where gzip is, etc. It's usually something minor that screws up the whole thing, that's hard to track down, but easy to fix. ;)

HTH,
MM

skeddy
msg:914427
6:02 pm on Jan 6, 2005 (gmt 0)

Thanks for the advice.

How would I go about doing this? (New to this)

If I run the PHP directly from a link, say [somesite.com...] it runs perfectly, and I get my "Back up job complete" printed out.

Is there a way I can log the CRON job to a file? (I'm on a webhost, not my own server)

MattyMoose
msg:914428
6:18 pm on Jan 6, 2005 (gmt 0)

To echo all of what you're about to do in the script, you should do something like this:

#!/usr/local/bin/php
<?php
$file = "Databasebackup-".Date("m-d-y-H-i-s");
echo "file: $file";
$command = "mysqldump -h DBHOST -u DBUSERNAME -pDBPASSWRD DB gzip >
/homepages/33/d85254132/htdocs/mysql_backup/".$file.".txt.gz";
echo $command;
@exec($command);
echo "Back up job complete";
?>

Also, give explicit paths to mysqldump and gzip. To find out where they are, type: "whereis mysqldump" and "whereis gzip", and copy the paths into your script.
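For example, the lookup and the result might go something like this (the paths below are typical guesses - yours may well differ):

whereis mysqldump
mysqldump: /usr/bin/mysqldump /usr/share/man/man1/mysqldump.1.gz
whereis gzip
gzip: /bin/gzip /usr/share/man/man1/gzip.1.gz

Then write /usr/bin/mysqldump and /bin/gzip (or whatever you actually get back) into the command.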

I have a feeling it's gzip that's dying, or at least the pipe between mysqldump and gzip. The hint being "no input file specified", but I'm not 100% sure. Try it without the pipe, and just putting mysqldump to a file, and gzipping the file.
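Something along these lines, as an untested sketch (dump.txt is just a stand-in name; from php you could run each line with its own exec call):

mysqldump -h DBHOST -u DBUSERNAME -pDBPASSWRD DB > /homepages/33/d85254132/htdocs/mysql_backup/dump.txt
gzip /homepages/33/d85254132/htdocs/mysql_backup/dump.txt

If those two steps succeed from cron, the dump itself is fine and the pipe is the culprit.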

nalin
msg:914429
6:33 pm on Jan 6, 2005 (gmt 0)

To send a job's output to a file, append "> /path/to/log.txt"

for instance
*/1 * * * * /usr/local/bin/php -f /kunden/homepages/33/d85254132/htdocs/SQLBackup.php > /path/to/log.txt

As an aside, your php script is using php to "exec", or run, shell commands, and by doing this you're overcomplicating it. You would find it better to do the same in a bash script, as you then avoid the overhead of running the thing through php.

Additionally, if you are backing up every ten minutes you would be better served to keep only the latest backup, or a handful of the latest (otherwise disk space usage could add up very quickly). Using date you can tweak this - for instance YMD will give you one file per day, YM one per month, Y one per year, etc., with the latest overwriting everything. Generally I have found it best to set up a series of cron jobs - one for monthly archives that are kept semi-permanently, the other for more regular intervals that overwrite the last run. For the monthly job you set a date argument (as with your script); for the other intervals you use a static file name which is overwritten on subsequent runs. See the crontab sketch below.
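As a sketch of that series (the times and script names here are only illustrative, not something from your setup):

# monthly archive, kept semi-permanently - the script stamps the filename with the date
0 3 1 * * /kunden/homepages/33/d85254132/htdocs/backup_monthly.sh
# every 12 hours, overwriting the same static filename each run
0 */12 * * * /kunden/homepages/33/d85254132/htdocs/backup_regular.sh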

Here is a bash script (untested and likely needing minor tweaks to work) which would do roughly what your php one does - it would need to be saved to a file and given execute permissions ("chmod 700"), and depending on the system setup it may be necessary to give it less restrictive permissions and ownership ("chmod 770; chown username:groupname", where username is your login and groupname would be ascertained by talking to your host provider).


#!/bin/bash
# date stamp for the filename - %y%m%d gives one file per day
DATE=`date +"%y%m%d"`
DBHOST=...
DBUSERNAME=...
DBPASSWORD=...
DB=...
# dump the database and compress it in one pass
mysqldump -h $DBHOST -u $DBUSERNAME -p$DBPASSWORD $DB | gzip > /homepages/33/d85254132/htdocs/mysql_backup/Databasebackup-$DATE.txt.gz
echo "Back up job complete"

skeddy
msg:914430
7:23 pm on Jan 6, 2005 (gmt 0)

Thanks for all the advice, I'll give them a try tonight.

Regarding the times when I run the job, I plan to actually run it every 12 hours (once it works), and the times at the moment are just so the cron daemon can spit out the results.

I'll let you know how I get on. Thanks again!

MattyMoose
msg:914431
5:50 pm on Jan 7, 2005 (gmt 0)

As an additional note, if you wanted all of the potential output from all streams (stdout and stderr rather than just stdout), use:

*/1 * * * * /usr/local/bin/php -f /kunden/homepages/33/d85254132/htdocs/SQLBackup.php > /path/to/log.txt 2>&1
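For instance, with a deliberately bad path (hypothetical, just to show the difference):

/usr/local/bin/php -f /no/such/file.php > /path/to/log.txt        # error text can bypass the log
/usr/local/bin/php -f /no/such/file.php > /path/to/log.txt 2>&1   # error text is captured too

The 2>&1 sends stderr (stream 2) to wherever stdout (stream 1) is already going.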

I agree with nalin that you should be using a shell script rather than PHP's exec function, since you're using the shell anyway with the exec function.

Also, whenever you're calling a command, use the full path. If you're using mysqldump, then use /usr/local/mysql/bin/mysqldump, or whatever the path is. It can save a lot of trouble in the future. :)

skeddy
msg:914432
7:12 pm on Jan 7, 2005 (gmt 0)

Hi guys,

Script is working perfectly now as a CRON job.

I followed MattyMoose's advice about removing the pipe just before the gzip command, and still had no luck.

So I then reverted to a simple file name for the .gz file.

It worked!

So my php file now contains the following:


#!/usr/local/bin/php
<?php
@exec("mysqldump -h DBHOST -u DBUSERNAME -pDBPASSWORD DBNAME ¦ gzip > /homepages/33/d85254132/htdocs/mysql_backup/BackUpIsWorking.txt.gz");
echo "Back up job complete"
?>

Thanks for all the help!

MattyMoose
msg:914433
10:34 pm on Jan 7, 2005 (gmt 0)

Looking at your exec, I just realized what the problem was...

@exec("mysqldump -h DBHOST -u DBUSERNAME -pDBPASSWRD DB gzip > /homepages/33/d85254132/htdocs/mysql_backup/".$file.".txt.gz");

Needs to be:
@exec("mysqldump -h DBHOST -u DBUSERNAME -pDBPASSWRD DB gzip > /homepages/33/d85254132/htdocs/mysql_backup/$file.txt.gz");

You don't need to break out of the double quotes and join the pieces with the dot operator - PHP interpolates $file directly inside a double-quoted string, which avoids any chance of the quoting tripping up the command string passed to exec. That should do it, AFAICT.
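For reference, a sketch of the script with both fixes in place - pipe restored and $file interpolated - plus full binary paths, which are assumptions here, so check them with whereis:

#!/usr/local/bin/php
<?php
// date-stamped filename, interpolated straight into the double-quoted command string
$file = "Databasebackup-".date("m-d-y-H-i-s");
@exec("/usr/bin/mysqldump -h DBHOST -u DBUSERNAME -pDBPASSWORD DBNAME | /bin/gzip > /homepages/33/d85254132/htdocs/mysql_backup/$file.txt.gz");
echo "Back up job complete";
?>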

MM
