Msg#: 4136908 posted 2:49 am on May 22, 2010 (gmt 0)
I recently found this script on a website - it backs up all the files on my GoDaddy site. I run it daily with cron and it works wonderfully. The problem is that the tar.gz files the script produces are corrupt. Is there an obvious reason for this, or a way to fix it? Or perhaps the script can be modified to simply copy the files rather than gzip them.
Thanks in advance for any help - here's the shell script:

#!/bin/bash
TERM=linux
export TERM
NOWDATE=`date +%m%d%y`  # Sets the date variable format for the zipped file: MMddyy
clear  # clears terminal window
echo
echo "Hi, $USER!"
echo
echo "Beginning backup of files @ `date`"
echo
echo "Zipping directory structure..."
tar -cvzf $HOME/html/backups/dailyback/$NOWDATE.tar.gz $HOME/html/*
echo "Backup Complete!"
Msg#: 4136908 posted 4:46 am on May 22, 2010 (gmt 0)
The options passed to the tar command in this script are valid. Your problem is most likely the directory where the backup is stored, not the backup options. The tar command archives $HOME/html/*, which is the same directory tree where you also store the backup file, so the archive ends up trying to include itself. That is probably where the backup fails: the tar command aborts and leaves an invalid gzip file behind.
Some other remarks about this script: the TERM=linux, export TERM and clear commands are only meaningful when the script is run from an interactive terminal. In your situation, where it is started from cron, they serve no purpose, and in the worst case they may interfere with the automated execution of the script.
Another suggestion is to add the --ignore-failed-read option to the tar command line. This makes tar continue when a read fault occurs (for example, because of open files, or files deleted by other processes while the backup is running).
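If the backup has to stay under $HOME/html, an --exclude pattern can also keep tar from trying to archive its own output. Here is a minimal sketch of both options together; to make it safe to run as-is, it works in a throwaway directory that stands in for $HOME/html rather than touching a real site:

```shell
#!/bin/bash
# Sandbox demo of the suggested tar options, safe to run anywhere.
set -eu
SITE=$(mktemp -d)                  # stands in for $HOME/html
mkdir -p "$SITE/backups/dailyback"
echo "<html>demo</html>" > "$SITE/index.html"
NOWDATE=$(date +%m%d%y)
cd "$SITE"
# --exclude keeps the backup directory out of the archive, so tar
#   never tries to read the file it is currently writing;
# --ignore-failed-read makes tar continue past files it cannot read.
tar --ignore-failed-read --exclude='./backups' \
    -czf "backups/dailyback/$NOWDATE.tar.gz" ./*
```

On the real host the same two options would simply be added to the existing tar line, with --exclude pointing at $HOME/html/backups.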
Msg#: 4136908 posted 2:14 pm on May 22, 2010 (gmt 0)
The best way to solve the recursive backup problem is to create a separate directory, $HOME/backup, outside the web root and use that as the storage location for your backup files.
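A minimal revision of the original script along these lines might look as follows. To keep it safe to execute as a demo, this sketch works in a throwaway directory; on the real host, ROOT would simply be $HOME:

```shell
#!/bin/bash
# Sketch of the revised backup script with a separate backup
# directory outside the web root. ROOT is a throwaway directory
# here so the demo is safe to run; on the host it would be $HOME.
set -eu
ROOT=$(mktemp -d)
mkdir -p "$ROOT/html" "$ROOT/backup"
echo "<html>demo</html>" > "$ROOT/html/index.html"
NOWDATE=$(date +%m%d%y)            # MMddyy, as in the original script
echo "Beginning backup of files @ $(date)"
# The archive is written to $ROOT/backup, which is not under
# $ROOT/html, so it is never re-archived and never web-accessible.
tar --ignore-failed-read -czf "$ROOT/backup/$NOWDATE.tar.gz" -C "$ROOT" html
echo "Backup Complete!"
```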
This will also solve the problem that, with your current configuration, anyone can download a complete backup of your site simply by pointing a browser at the /backups/dailyback/$NOWDATE.tar.gz path on your site.
Don't forget to move these backups to another server or to your home computer on a daily basis, because a backup kept on the same drive/computer as the original files is practically worthless.
Msg#: 4136908 posted 2:29 pm on May 22, 2010 (gmt 0)
Thanks for the suggestion - a quick look at the log from the script confirms that it is trying to back itself up recursively. As a GoDaddy customer I do not have complete access to the sub-www directories of my site, but I will try what you suggested. Also, regarding your last point that I should download these backups daily: is there any way to automate that with some sort of application?
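One common way to automate this, assuming the home machine runs Linux or macOS and the hosting account has SSH/SFTP enabled, is a cron entry on the home computer that pulls each day's archive. The user name, host, and paths below are placeholders, and the path assumes the $HOME/backup directory suggested earlier in the thread:

```shell
# crontab entry on the home computer (run "crontab -e" to add it).
# Placeholders: youruser, yoursite.example.com, and the paths.
# Note that "%" must be escaped as "\%" inside a crontab.
30 4 * * * scp youruser@yoursite.example.com:backup/$(date +\%m\%d\%y).tar.gz "$HOME/site-backups/"
```

rsync over SSH would work as well; either way, it's worth running the transfer once by hand to confirm the SSH login and paths before relying on cron.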