Forum Moderators: phranque


Useful Linux webserver batch files

how to do a quick and easy regular backup of your site


Frank_Rizzo

11:20 am on Jul 5, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I like to zip up my website each night. If I screw up any changes during the day, I can just unzip from the backup file rather than having to reload tapes etc.

assuming home dir is /home/frank

create a dir structure

backups
backups/daily
backups/weekly
backups/latest
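
To create those in one go (same /home/frank layout assumed):

mkdir -p /home/frank/backups/daily /home/frank/backups/weekly /home/frank/backups/latest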

I also have a dir called

batch

where I put the batch files. A crontab is run each night thus:

0 4 * * 0 /home/frank/batch/zip_weekly
0 4 * * 1-6 /home/frank/batch/zip_daily

This runs the daily backup Monday through Saturday and the weekly backup on Sundays. Now you need to know what is in those batch files, right?

zip_daily
----------
#!/bin/sh
cd /home/frank/backups
# tar up the whole home dir; quote the exclude patterns so the shell doesn't expand them
tar -cf website.tar /home/frank/ --exclude='*.tar' --exclude='*.ZIP' --exclude='*.zip' --exclude='*.gz'
gzip website.tar
# keep a dated copy in daily/ and the newest backup in latest/
cp /home/frank/backups/website.tar.gz /home/frank/backups/daily/`date +%d%m%y`.tar.gz
mv /home/frank/backups/website.tar.gz /home/frank/backups/latest/website.tar.gz
----------

First it creates a tar file (a single file holding the complete site). Note this backs up not only your public_html but all dirs under your home dir, including databases, stuff hidden from the world, etc.

I exclude existing tar, zip and gz files so that we don't back up backup files.

Next, the file is compressed using gzip.

The file is then copied to the daily dir and renamed to today's date, e.g. 050702.tar.gz.

Lastly, website.tar.gz is moved to the latest dir so that we always know where the latest good backup is.
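
If you do need to roll back, restoring is just an untar. A rough sketch (not part of the scripts above; the restore dir is just a scratch location), assuming GNU tar, which strips the leading / from member names so everything unpacks under home/frank/...:

mkdir /home/frank/restore
cd /home/frank/restore
tar -xzf /home/frank/backups/latest/website.tar.gz

Then copy back whichever files you broke from the home/frank tree under restore/.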

zip_weekly
----------
#!/bin/sh
cd /home/frank/backups
tar -cf website.tar /home/frank/ --exclude='*.tar' --exclude='*.ZIP' --exclude='*.zip' --exclude='*.gz'
gzip website.tar
# same as zip_daily, but the dated copy goes in weekly/
cp /home/frank/backups/website.tar.gz /home/frank/backups/weekly/`date +%d%m%y`.tar.gz
mv /home/frank/backups/website.tar.gz /home/frank/backups/latest/website.tar.gz
----------

Similar to the above, but we shove the dated copy in the weekly dir instead.

Another useful tip is to use the Windows scheduler to run a psftp batch file to grab website.tar.gz from the backups/latest directory each morning.

I have my scheduler set to run every morning to download the complete website in zipped format. It gives me a good offline copy of the site each day.
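
In case a concrete example helps, something like this should work (host, user and file names here are placeholders, not my actual setup; psftp is PuTTY's command-line SFTP client, and you'd use a saved session or key for the login). The scheduled task runs:

psftp frank@example.com -b get_latest.txt

and get_latest.txt contains:

cd backups/latest
get website.tar.gz
quit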

Brett_Tabke

11:18 pm on Jul 6, 2002 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Very interesting. I use something similar here, but skip the tar and just use zip.
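
For anyone wanting to go that route, a rough zip equivalent of Frank's zip_daily (same paths as above assumed, not my actual script) would be:

cd /home/frank/backups
zip -r -q latest/website.zip /home/frank -x '*.zip' '*.ZIP' '*.tar' '*.gz'
cp latest/website.zip daily/`date +%d%m%y`.zip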

On other sites, I "reverse" the process. My copy of the site on local disk is the live copy. On the live site, I force all dynamic data possible into one directory. Then all there is to backing up the site is grabbing that dynamic directory (drag and drop the folder in the ftp client).
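
If you'd rather not drag and drop by hand, the same pieces from earlier in the thread can do it: a crontab line on the live server tars just the dynamic dir into the usual latest/ spot, and the scheduled psftp download pulls that one file. A sketch with made-up paths (not my actual setup):

0 4 * * * tar -czf /home/brett/backups/latest/dynamic.tar.gz /home/brett/public_html/dynamic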