
Custom backup script not working in cron

Half works, half doesn't

     
6:27 pm on Feb 23, 2004 (gmt 0)

New User

10+ Year Member

joined:Nov 12, 2003
posts:30
votes: 0


I've written a website backup script from scratch. First, it runs a mysqldump with backticks, then it creates a tarball of the whole site.

Running this from SSH works fine, and running this through http works fine too. No errors, no problems, no fuss, no muss.

When it runs automatically from cron, the mysqldump fails, though I can't seem to catch the error message.

Here are some details:

Cron line:

56 5 * * * cd /usr/home/mydirectory/htdocs/cgi-bin/backup; perl backup.pl

Mysqldump line (backticks):

`mysqldump -u username --password=password databasename > /usr/home/mydirectory/htdocs/cgi-bin/backup/files/filename.txt`;

Tarball line (backticks):

`tar czpvf /usr/home/mydirectory/htdocs/cgi-bin/backup/files/backup.tar.gz /usr/home/mydirectory/htdocs`;

Our webhost is offering no support on this. I have a hunch it's a permissions difference when running under cron, but both the files and the backup directories are chmod 755'd.
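One way to catch the error message that cron swallows would be to redirect the job's output to a log file; a sketch of the crontab entry (the log path is a placeholder):

```shell
# crontab entry: append stdout and stderr to a log so the failure message survives
56 5 * * * cd /usr/home/mydirectory/htdocs/cgi-bin/backup && perl backup.pl >> /usr/home/mydirectory/backup-cron.log 2>&1
```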

(directory names, user names and passwords changed to protect the innocent)

Any help is appreciated.

Thanks,

-- Lax

9:45 pm on Feb 23, 2004 (gmt 0)

Preferred Member

10+ Year Member

joined:Aug 7, 2003
posts:408
votes: 0


Instead of:

56 5 * * * cd /usr/home/mydirectory/htdocs/cgi-bin/backup; perl backup.pl

try this:

56 5 * * * /path/to/perl/backup.pl

12:02 am on Feb 24, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Oct 4, 2001
posts:1277
votes: 17


Good suggestion but...

You want /path/to/perl (the interpreter), then a space, then /path/to/script/scriptname.pl.

Also add an email addy to the cron job so you can see what kind of error is happening.
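Cron can mail each job's output to an address set at the top of the crontab; a sketch (the address and the perl path are placeholders — `which perl` over SSH shows the real one):

```shell
# cron mails each job's stdout/stderr to the MAILTO address
MAILTO=you@example.com
56 5 * * * /usr/bin/perl /usr/home/mydirectory/htdocs/cgi-bin/backup/backup.pl
```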

1:11 pm on Feb 24, 2004 (gmt 0)

Junior Member

10+ Year Member

joined:Sept 16, 2000
posts:122
votes: 0


Running this from SSH works fine, and running this through http works fine too. No errors, no problems, no fuss, no muss.

A very simple shell script should work. Just type the commands exactly the same as you use through SSH and save them to a file, for example

tar czpf /home/mysite/backup.tar.gz /home/mysite
mysqldump -u user -pmypasswd mydb > /path/to/mydump.txt

You don't need a perl script, backticks or semi-colons.
Also the v option in tar isn't needed.

If the file is executable it should work with the following

56 5 * * * /path/to/my/backup

Each line will be executed when the script is run.

12:12 am on Feb 26, 2004 (gmt 0)

New User

10+ Year Member

joined:Nov 12, 2003
posts:30
votes: 0


- Tombola: I have tried it that way as well

- Ian: I'll add that e-mail address right now

- Gorufu: I thought about that, but I'm timestamping the filenames. I know there's a way to do that with the *nix date command, but then the timestamps wouldn't match between the mysqldump file and the tarball. I'll remove the "v" flag from the command, thanks.

I know I'm being particular about this, but I want a rolling backup history of about a week at a time. This way, I can remove the "old" backups from the directory by timestamp. That's why I'm running this from a perl script.
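For what it's worth, the shared timestamp and the rolling week of history can both be done in a short shell script; a sketch, where the paths, credentials, and the 7-day window are all assumptions (capturing `date` once means the dump and the tarball get the same stamp):

```shell
#!/bin/sh
# Sketch only -- paths and credentials are placeholders.
BACKUPDIR=/usr/home/mydirectory/htdocs/cgi-bin/backup/files
STAMP=`date +%b-%d-%Y`    # e.g. Feb-26-2004; captured once, so both files match

mysqldump -u username --password=password databasename > $BACKUPDIR/$STAMP.sql
tar czpf $BACKUPDIR/$STAMP.tar.gz /usr/home/mydirectory/htdocs

# Rolling history: delete backups older than 7 days
find $BACKUPDIR \( -name '*.sql' -o -name '*.tar.gz' \) -mtime +7 -delete
```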

Thanks guys, and I'll let you know about the little changes.

-- Lax

12:31 am on Feb 26, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:July 1, 2002
posts:1427
votes: 2


A very simple shell script should work. Just type the commands exactly the same as you use through SSH and save them to a file, for example

Don't be so sure.
Depending on the configuration of the server, cron will not use the same environment variables that are set when you run the shell of your choice. Of those variables, PATH causes the most problems.

Laxters, try using the full path to mysqldump, or set an appropriate PATH in your crontab file.
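A sketch of the crontab fix (the directories in PATH are assumptions — `which mysqldump` over SSH will show where it actually lives):

```shell
# Set PATH at the top of the crontab so cron jobs can find mysqldump and tar
PATH=/usr/local/bin:/usr/bin:/bin

56 5 * * * cd /usr/home/mydirectory/htdocs/cgi-bin/backup; perl backup.pl
```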

10:06 pm on Feb 26, 2004 (gmt 0)

Junior Member

10+ Year Member

joined:Sept 16, 2000
posts:122
votes: 0


Laxters,

Hopefully this example script will work. It works for me on RH 9.
==============================
#!/usr/bin/perl

$dir = "/home/myname";

@months = ('Jan','Feb','Mar','Apr','May','Jun','Jul','Aug','Sep','Oct','Nov','Dec');
($sec,$min,$hour,$mday,$mon,$year,$wday) = (localtime(time))[0,1,2,3,4,5,6];
$year += 1900;
$date = "$months[$mon]-$mday-$year";

# Example saved files Feb-26-2004.sql Feb-26-2004.tar.gz

`mysqldump -u myname -pmypass mydb > $dir/backup/$date.sql`;
`tar czf $dir/backup/$date.tar.gz --exclude="backup" $dir`;

exit;
==============================================

--exclude="backup" (excludes your existing backup files, assuming they were saved in a directory named backup).

chmod the backup script to 700 and it should work in any directory within your user area.

Edit your crontab to the following
56 5 * * * /path/to/my/backup.pl