

Automatic backups on Linux/Apache


mr_nabo

12:13 pm on Dec 4, 2009 (gmt 0)

10+ Year Member



Hi,

I've been developing sites for a little while now and have always backed up the sites manually (unless it's a CMS that has the option for a client to make their own backups). However, I'm wondering whether I could set up backups to run automatically?

I currently back up the database every couple of days and the site files every 1 or 2 weeks. Am I right in thinking that a 'cron' job is for backing up databases regularly and maybe emailing them to something like a Google account?

If I'm right about cron jobs being for databases, is there also a way to back up the files regularly and store them on the server?

Thanks for any advice,

mn

jdMorgan

5:20 pm on Dec 4, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yes, this is a very common use of cron.

Two caveats though:

1) Make sure you use a method that will let you notice immediately if the cron job fails to run on schedule (a rough sketch of one such check follows after these two points). It would be a pity to have a problem with your server (for example, a major hack attack) and then find out that your automatic backup scripts failed or stalled two months ago...

2) Don't rely only on backups kept on your server itself. These backups could get hacked/deleted/corrupted as well (all hard drives fail eventually). Therefore it would be a good idea to make sure that *all* files are backed up on multiple servers, or on your server and on your own workstation(s).
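
For caveat 1, one simple approach is a small "staleness check": a second script, run from cron (ideally from a different machine than the one making the backups), that mails you whenever the newest backup file is older than expected. This is a rough sketch only -- the backup path and mail address are placeholders, and the mail command assumes a working mail setup on that machine:

#!/bin/sh
# backup_check.sh -- warn if no recent backup exists
# BACKUP_DIR and ALERT_MAIL are placeholders; adjust for your own setup
BACKUP_DIR=/home/admin/backup
ALERT_MAIL=you@example.com

# look for any *.tar.gz modified within the last 2 days; mail a warning if none is found
if [ -z "$(find "$BACKUP_DIR" -name '*.tar.gz' -mtime -2 2>/dev/null)" ]; then
    echo "No backup newer than 2 days in $BACKUP_DIR on $(hostname)" \
        | mail -s "Backup check FAILED" "$ALERT_MAIL"
fi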

You can likely get some ideas by searching for "scripts backup cron" and related terms -- look at the major Perl and PHP script library sites as well as doing general Web searches.

Jim

mr_nabo

5:56 pm on Dec 4, 2009 (gmt 0)

10+ Year Member



Thanks jdMorgan, much appreciated. I'll look into a script that works for me.

yaix2

8:40 am on Dec 9, 2009 (gmt 0)

10+ Year Member



I am using two shell scripts and two cron jobs: one on the server to back up the databases, and one on my local Ubuntu machine to fetch all changed files and database updates from the server regularly. Here is my setup:

1. Create a shell script on the server to make backups of your database, e.g. "/home/admin/backup/make_db_backup.sh". I always keep five days of backups rotating on the server; the first six lines rotate the old gzipped backup files.


#!/bin/sh
# Rotate the last five gzipped backups (the oldest one is dropped)
rm -f /home/admin/backup/db.sql.tar.gz.05
mv -f /home/admin/backup/db.sql.tar.gz.04 /home/admin/backup/db.sql.tar.gz.05
mv -f /home/admin/backup/db.sql.tar.gz.03 /home/admin/backup/db.sql.tar.gz.04
mv -f /home/admin/backup/db.sql.tar.gz.02 /home/admin/backup/db.sql.tar.gz.03
mv -f /home/admin/backup/db.sql.tar.gz.01 /home/admin/backup/db.sql.tar.gz.02
mv -f /home/admin/backup/db.sql.tar.gz /home/admin/backup/db.sql.tar.gz.01
# Dump the database and gzip it (note: the password on the command line is
# visible to other users via "ps"; a ~/.my.cnf file avoids that)
rm -f /home/admin/backup/db.sql
mysqldump -u dbuser -p12345 --databases db > /home/admin/backup/db.sql
tar -cpzf /home/admin/backup/db.sql.tar.gz /home/admin/backup/db.sql
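
One thing worth adding to that script (a sketch only, not tested on your setup) is a check on mysqldump's exit status, so a failed dump does not silently replace the current backup -- with MAILTO set in the crontab (see below), cron will then mail you the error:

# abort before tarring a broken dump; the last good backup stays as db.sql.tar.gz.01
mysqldump -u dbuser -p12345 --databases db > /home/admin/backup/db.sql || {
    echo "mysqldump failed on $(hostname)" >&2
    exit 1
}
tar -cpzf /home/admin/backup/db.sql.tar.gz /home/admin/backup/db.sql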

2. Then run that script on the live server from cron each day; pick a low-traffic time for the run. (Here you could also have cron send you a mail if the script fails, or on every run.)

30 8 * * * /home/admin/backup/make_db_backup.sh
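
For anyone not used to cron, the entry is added on the server with "crontab -e". Broken down, it looks like this (the MAILTO line is optional and assumes the server can send mail -- with it set, cron mails you anything the job prints, including errors):

# added with: crontab -e
MAILTO=you@example.com
# minute hour day-of-month month day-of-week   command
30 8 * * * /home/admin/backup/make_db_backup.sh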

3. Each day, the gzipped version of the new database backup file is downloaded to my local machine, together with all other files that changed after the last backup.

To download all changed files, I use rsync with "--delete", which means rsync will also delete any files in the target directory that no longer exist in the source directory, so be careful while setting up the system! With "--exclude" you can keep certain directories from being sync'ed, like sessions etc. If your host supports authentication by public/private key, you can run this from a cron job; otherwise you need to start it manually every day and type in your SSH password.

rsync -avP --delete --exclude=sessions webx@xx.xx.xx.xx:~/html/ /home/yy/Websites/webx
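
Because of "--delete", it is worth doing a dry run first so you can see what rsync would remove before it actually does anything (same placeholder host and paths as above; -n is rsync's --dry-run switch, and --exclude can be repeated):

# dry run: show what would be transferred/deleted, change nothing
rsync -avPn --delete --exclude=sessions webx@xx.xx.xx.xx:~/html/ /home/yy/Websites/webx

# --exclude can be stacked for more directories, e.g. a cache directory
rsync -avP --delete --exclude=sessions --exclude=cache webx@xx.xx.xx.xx:~/html/ /home/yy/Websites/webx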

4. Make a shell script to download your database backup. I then automatically import the download on my local machine, so it acts as a mirror of the live system. That way I know the backup files actually work and are not corrupt, and I can try things out locally that will simply be overwritten by the next day's backup. I keep three days of old database backups locally.


#!/bin/sh

# Rotate the local copies of the last three backups
rm -f /home/yy/Websites/backup/3-days-ago/*.sql.tar.gz
mv -f /home/yy/Websites/backup/2-days-ago/*.sql.tar.gz /home/yy/Websites/backup/3-days-ago/
mv -f /home/yy/Websites/backup/1-day-ago/*.sql.tar.gz /home/yy/Websites/backup/2-days-ago/
mv -f /home/yy/Websites/backup/*.sql.tar.gz /home/yy/Websites/backup/1-day-ago/

# Download today's db backup from the server
scp admin@xx.xx.xx.xx:~/backup/*.sql.tar.gz /home/yy/Websites/backup/

# Unpack and import it into the local MySQL server
# (the archive contains home/admin/backup/db.sql, so it unpacks under /tmp)
cd /tmp
tar -xzf /home/yy/Websites/backup/db.sql.tar.gz
mysql -u root -p12345 < /tmp/home/admin/backup/db.sql
rm -f /tmp/home/admin/backup/db.sql
cd ~

5. Finally, you need a local cron job to call that script every day (but give the backup script on the live server some time to finish the database backup first!). In my case I call the script manually, because my host does not support login by public/private key, so I have to type in my SSH passwords every time.

sh /home/yy/Websites/backup/get_db_backup.sh
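
If your host does support public/private key login, the manual step goes away and the local crontab entry could look something like this (times are only an example -- leave enough of a gap after the 08:30 server-side job from step 2):

# local crontab: fetch and import the backup an hour after the server-side dump
30 9 * * * sh /home/yy/Websites/backup/get_db_backup.sh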