
Forum Moderators: bakedjake


Shell Script to Backup Files

broken tar.gz files produced from shell script



2:49 am on May 22, 2010 (gmt 0)

5+ Year Member


I recently found this script on a website - it backs up all files on my GoDaddy website. I run it with Cron daily - works wonderfully.
The problem is, the tar.gz files produced by the script are corrupt. Is there any obvious reason for this, or any way to fix it? Or perhaps the script can be modified to simply copy the files, and not gzip them.

Thanks in advance for any help - here's the shell script:

TERM=linux
export TERM
NOWDATE=`date +%m%d%y` # Sets the date variable format for zipped file: MMddyy
clear # clears terminal window
echo "Hi, $USER!"
echo "Beginning backup of files @ `date`"
echo "Zipping directory structure..."
tar -cvzf $HOME/html/backups/dailyback/$NOWDATE.tar.gz $HOME/html/*
echo "Backup Complete!"


4:46 am on May 22, 2010 (gmt 0)

WebmasterWorld Senior Member lammert

The options provided to the tar command in this script are valid. The problem is probably the directory where the backup is stored, not the backup options. The tar command backs up $HOME/html/*, which is the same directory tree where you store the backup file, so tar ends up trying to read the growing archive into itself. That is most likely where the backup fails and tar aborts, leaving an invalid gzip file.

Some other remarks about this script: the TERM=linux, export TERM and clear commands are only meaningful when the script is run from an interactive terminal. When the script is started from cron they serve no purpose, and in the worst case they may interfere with its automated execution.

Another suggestion is to add the --ignore-failed-read option to the tar command line. This makes tar continue when a read fault occurs (for example due to open files, or files deleted by other processes while the backup is running) instead of aborting.
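Here is a hedged sketch of what that looks like; the paths are stand-ins for the poster's $HOME/html tree, and the snippet builds a throw-away directory so it can be run as-is (GNU tar assumed for --ignore-failed-read):

```shell
# Scratch tree standing in for $HOME/html (illustrative paths only).
WORK=$(mktemp -d)
mkdir -p "$WORK/html" "$WORK/backup"
echo '<html></html>' > "$WORK/html/index.html"

# --ignore-failed-read: skip unreadable or vanished files instead of
# aborting, so one bad file no longer leaves a truncated .tar.gz behind.
# The archive is written to a directory OUTSIDE the tree being archived.
tar --ignore-failed-read -czf "$WORK/backup/site.tar.gz" -C "$WORK" html

# Quick integrity check: gzip -t fails on a corrupt archive.
gzip -t "$WORK/backup/site.tar.gz" && echo "archive OK"
```

The same gzip -t check can be added to the cron script to catch a corrupt archive on the day it is produced rather than when a restore is needed.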


1:09 pm on May 22, 2010 (gmt 0)

5+ Year Member


If my problem IS that it is trying to back up the backup itself, how can I exclude the folder where the backups are stored?

Thanks for the quick response.


2:14 pm on May 22, 2010 (gmt 0)

WebmasterWorld Senior Member lammert

The best way to solve the recursive backup problem of your backup files is to create a separate backup directory $HOME/backup and use that directory as the storage location for your files.

This will also solve another problem: with your current configuration, anyone can download a complete backup of your site simply by pointing a browser at the /backups/dailyback/$NOWDATE.tar.gz path on your site.

Don't forget to move these backups to another server or to your home computer on a daily basis, because a backup stored on the same drive/computer as the original files is practically worthless.
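Putting those pieces together, the cron script might look like this - a sketch only, assuming GNU tar and a $HOME/backup directory that is not reachable through the web server (the mkdir -p line just keeps the sketch safe to run anywhere):

```shell
# Sketch of the revised backup script: archive goes to $HOME/backup,
# outside the $HOME/html tree being read and outside the web root.
NOWDATE=$(date +%m%d%y)                 # MMddyy, as in the original script
mkdir -p "$HOME/backup" "$HOME/html"    # -p is a no-op if they already exist

echo "Beginning backup of files @ $(date)"
# -C $HOME archives the html tree by relative path; since the archive is
# written outside that tree, it can never be swallowed by its own backup.
tar --ignore-failed-read -czf "$HOME/backup/$NOWDATE.tar.gz" -C "$HOME" html
echo "Backup Complete!"
```

The clear and TERM lines from the original are dropped here, since cron runs the script without a terminal.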


2:29 pm on May 22, 2010 (gmt 0)

5+ Year Member


Thanks for the suggestion - a quick look at the log from the script confirms that it is trying to back up recursively.
As a GoDaddy customer I do not have complete access to directories outside the www root of my site, but I will try what you suggested.
Also, regarding your last point that I should download these backups daily: is there any way to automate this with some sort of application?

Thanks again!
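On the automation question, one common approach - a sketch only, assuming the home machine runs Unix with rsync installed and the hosting account allows SSH access, which not every shared host does - is a cron job on the home machine that pulls the archives down; the user, host, and paths below are placeholders:

```shell
# Hypothetical crontab entry on the HOME machine, not the server:
# every night at 03:30, mirror the remote backup directory locally.
# "user@example.com" and both paths are placeholders.
30 3 * * * rsync -az user@example.com:backup/ /home/me/site-backups/
```

If SSH is not available, a scheduled FTP client on the home machine can serve the same purpose.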


2:39 am on May 25, 2010 (gmt 0)

5+ Year Member

man tar

Simply tell `tar` not to include the backup file:

cd /home
tar zcf /home/EXAMPLE/`date +%Y%b%d`_EXAMPLE_backup.tar.gz --exclude='*backup.tar.gz' EXAMPLE

