| 4:46 am on May 22, 2010 (gmt 0)|
The options provided to the tar command in this script are valid. Your problem is probably the directory where the backup is stored, not the backup options. The tar command backs up $HOME/html/*, which is the same directory tree where you also store the backup file. The result is that tar tries to include the growing archive file in itself; that is most likely where the backup fails and tar aborts, leaving an invalid gzip file.
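You can reproduce the failure mode yourself in a throwaway directory (a minimal sketch; the /tmp paths are made up for the demo):

```shell
# Reproduce the problem: the archive is written inside the tree being archived.
mkdir -p /tmp/selfback-demo/html
echo '<h1>hello</h1>' > /tmp/selfback-demo/html/index.html
# Modern GNU tar notices the archive inside its own input and warns
# ("file is the archive; not dumped"); older tar builds can fail and
# leave a truncated, invalid gzip file instead.
tar zcf /tmp/selfback-demo/html/backup.tar.gz \
    -C /tmp/selfback-demo html 2> /tmp/selfback-demo/tar-errors.log || true
cat /tmp/selfback-demo/tar-errors.log
```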
Some other remarks about this script: the TERM=linux, export TERM and clear commands are only meaningful when the script is run from an interactive command line. In your situation, where the script runs from cron, they serve no purpose, and in the worst case they may interfere with its automated execution.
Another suggestion is to add the --ignore-failed-read option to the tar command line. This makes tar continue when a read fault occurs (for example because of open files, or files deleted by other processes while the backup is running).
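For illustration, a small sketch (throwaway /tmp paths, GNU tar assumed) showing the flag letting tar continue past an unreadable file:

```shell
mkdir -p /tmp/irf-demo/src
echo "readable"   > /tmp/irf-demo/src/good.txt
echo "unreadable" > /tmp/irf-demo/src/bad.txt
chmod 000 /tmp/irf-demo/src/bad.txt   # simulate a read fault (no effect if run as root)
# Without the flag, GNU tar treats the read error as an error status;
# with it, tar logs a warning and keeps archiving the remaining files.
tar zcf /tmp/irf-demo/out.tar.gz --ignore-failed-read -C /tmp/irf-demo src || true
tar ztf /tmp/irf-demo/out.tar.gz
chmod 644 /tmp/irf-demo/src/bad.txt   # restore permissions for cleanup
```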
| 1:09 pm on May 22, 2010 (gmt 0)|
If my problem IS that it is trying to back up the backup itself, how can I exclude the folder where the backups are stored?
Thanks for the quick response.
| 2:14 pm on May 22, 2010 (gmt 0)|
The best way to solve the recursive backup problem of your backup files is to create a separate backup directory $HOME/backup and use that directory as the storage location for your files.
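A rough sketch of the revised cron script under that layout (the paths are hypothetical; adjust them to your account, and note the first line is only demo scaffolding since a real account already has $HOME/html):

```shell
#!/bin/sh
# Demo scaffolding so the sketch is runnable as-is; remove on a real account.
mkdir -p "$HOME/html" && echo '<h1>hi</h1>' > "$HOME/html/index.html"

NOWDATE=$(date +%Y%b%d)
BACKUPDIR="$HOME/backup"     # separate tree: never inside $HOME/html

mkdir -p "$BACKUPDIR"
# -C changes into $HOME first, so the archive stores "html/..." paths
# and tar never meets its own output file.
tar zcf "$BACKUPDIR/$NOWDATE.tar.gz" --ignore-failed-read -C "$HOME" html
```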
This will also solve the problem that, with your current configuration, anyone can download a complete backup of your site simply by pointing a browser at the /backups/dailyback/$NOWDATE.tar.gz path on your site.
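If the backups must stay under the web root for now, you can at least block web access to that directory (a sketch assuming Apache, which shared hosts like GoDaddy typically run; this is the Apache 2.2 syntax):

```apache
# Hypothetical .htaccess placed in the backups directory
Order allow,deny
Deny from all
```

Moving the files out of the web root is still the cleaner fix.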
Don't forget to move these backups to another server or your home computer on a daily basis, because a backup kept on the same drive/computer as the original files is practically worthless.
| 2:29 pm on May 22, 2010 (gmt 0)|
Thanks for the suggestion - a quick look at the script's log confirms that it is trying to back itself up recursively.
As a GoDaddy customer I do not have complete access to the sub-www directories of my site; however, I will try to do what you suggested.
Also, in regards to your last statement that I should download these backups daily, is there any way I can automate this with some sort of application?
| 2:39 am on May 25, 2010 (gmt 0)|
Simply tell `tar` not to include the backup file:
```shell
tar zcf /home/EXAMPLE/`date +%Y%b%d`_EXAMPLE_backup.tar.gz --exclude='*backup.tar.gz' EXAMPLE
```
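As for pulling the backups down automatically: a cron job on your home machine can fetch them over SSH, assuming you set up key-based login first. A sketch with a made-up host and paths (a hypothetical crontab entry, added via `crontab -e` on the home machine):

```
# m h dom mon dow  command — fetch the backup directory at 05:30 every day
30 5 * * * rsync -az user@example.com:backup/ "$HOME/site-backups/"
```

Any tool that speaks SSH (rsync, scp, sftp) will do; rsync only transfers files it doesn't already have.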