Forum Moderators: phranque
assuming home dir is /home/frank
create a dir structure
backups
backups/daily
backups/weekly
backups/latest
I also have a dir called
batch
where I put the batch files. A crontab is run each night thus:
0 4 * * 0 /home/frank/batch/zip_weekly
0 4 * * 1-6 /home/frank/batch/zip_daily
What this does is run a daily backup Monday through Saturday (days 1-6), and a weekly backup on Sundays (day 0). Now you need to know what is in those batch files, right?
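For reference, the five crontab fields are minute, hour, day of month, month, and day of week (where 0 = Sunday), so the two entries read like this:

```
# min  hour  dom  month  dow  command
0      4     *    *      0    /home/frank/batch/zip_weekly   (04:00 every Sunday)
0      4     *    *      1-6  /home/frank/batch/zip_daily    (04:00 Monday-Saturday)
```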
zip_daily
----------
cd /home/frank/backups
tar -cf website.tar /home/frank/ --exclude='*.tar' --exclude='*.ZIP' --exclude='*.zip' --exclude='*.gz'
gzip website.tar
cp /home/frank/backups/website.tar.gz /home/frank/backups/daily/`date +%d%m%y`.tar.gz
mv /home/frank/backups/website.tar.gz /home/frank/backups/latest/website.tar.gz
----
What this does is first create a tar file (a single file holding the complete site) of your site. Note this backs up not only your public_html but all dirs under your home dir, including databases, hidden-from-the-world stuff, etc.
I exclude existing tar and zip files so that we don't back up backup files (the patterns are quoted so the shell passes them to tar instead of expanding them).
Next, the file is compressed using gzip.
The file is then copied to the daily dir and renamed to today's date, e.g. 050702.tar.gz.
Lastly, the file website.tar.gz is moved to the latest dir so that we always know where the latest good backup is.
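A hedged sketch of those same steps, written as a POSIX shell function so the paths are parameters rather than hard-coded (the function name and layout are my own, not part of the original batch file):

```shell
#!/bin/sh
# Sketch of the zip_daily steps above. Assumes the backups dir
# already contains daily/, weekly/ and latest/ subdirectories.
backup_site() {
    src=$1       # directory to back up, e.g. /home/frank
    backups=$2   # backups dir, e.g. /home/frank/backups

    cd "$backups" || return 1

    # One tar of the whole tree, skipping existing archives so we
    # don't back up backup files. The patterns are quoted so the
    # shell passes them through to tar. (2>/dev/null hides tar's
    # harmless warning about stripping the leading / from paths.)
    tar -cf website.tar \
        --exclude='*.tar' --exclude='*.ZIP' --exclude='*.zip' --exclude='*.gz' \
        "$src" 2>/dev/null

    gzip -f website.tar

    # Dated copy in daily/, fixed-name copy in latest/
    cp website.tar.gz "daily/$(date +%d%m%y).tar.gz"
    mv website.tar.gz latest/website.tar.gz
}
```

Calling `backup_site /home/frank /home/frank/backups` reproduces zip_daily; swapping `daily` for `weekly` in the cp line gives zip_weekly.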
zip_weekly
----------
cd /home/frank/backups
tar -cf website.tar /home/frank/ --exclude='*.tar' --exclude='*.ZIP' --exclude='*.zip' --exclude='*.gz'
gzip website.tar
cp /home/frank/backups/website.tar.gz /home/frank/backups/weekly/`date +%d%m%y`.tar.gz
mv /home/frank/backups/website.tar.gz /home/frank/backups/latest/website.tar.gz
----------
Similar to the above, but this time the dated copy goes in the weekly dir instead.
Another useful tip is to use the Windows scheduler to run a PSFTP batch file that grabs website.tar.gz from the backups/latest directory each morning.
I have my scheduler set to run every morning to download the complete website in zipped format. It gives me a good offline copy of the site each day.
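For the curious, a setup along those lines might look like this (the host name, user name and file names here are made up; psftp's -b flag runs the commands listed in the given file):

```
get_backup.bat (run daily by the Windows scheduler):
    psftp frank@example.com -b get_backup.txt

get_backup.txt (the PSFTP command list):
    cd backups/latest
    get website.tar.gz
    quit
```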
On other sites, I "reverse" the process. My copy of the site on local disk is the live copy. On the live site, I force all dynamic data possible into one directory. Then all there is to backing up the site is grabbing that dynamic directory (drag and drop the folder in the FTP client).