Forum Moderators: phranque


Server Backup Software

Need to backup a directory daily


Captaffy

12:36 am on May 9, 2006 (gmt 0)

10+ Year Member



I have a web site which is sorely lacking an automatic backup solution. The database is being backed up automatically, but there is a directory of images and other media that I would like to have downloaded to one of my home machines daily. Something like an automatic ftp script, that only downloads changed files.

Any suggestions?

MichaelBluejay

9:05 am on May 9, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Use at your own risk, but I think this will work....

Assuming your server is a UNIX box, this command would work at the shell prompt:

cp -R /home/user/websitefolder /home/user/backup

Now, the trick is getting that to run automatically. To do that you set up a "cronjob". Ask your webhost how to do that.
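To make that concrete, here's what the cron entry might look like (the schedule and paths are just examples to adapt, not something from your setup):

```
# Open your crontab for editing with: crontab -e
# Then add a line like this to run the copy every night at 2:30 am.
# Fields are: minute  hour  day-of-month  month  day-of-week  command
30 2 * * * cp -R /home/user/websitefolder /home/user/backup
```

Some hosts also provide a control-panel front end for cron, which is why it's worth asking them first.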

Romeo

9:43 am on May 9, 2006 (gmt 0)

10+ Year Member



you may look at 'rsync', which can do exactly what you want, especially in fetching only new or changed files on subsequent runs.
"rsync is an open source utility that provides fast incremental file transfer."
It needs rsync installed on the server side, though (it can run as a daemon, or be invoked over SSH).

Regards,
R.

trillianjedi

9:43 am on May 9, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think you can probably use TAR with the --update switch which might be more efficient than CP if you want to FTP to another server.

You need to check this, but I think in essence it would be something like:-


#!/bin/sh

HOST='ftpmybackupserver.net'
USER='myUserid'
PASSWD='mypassword'

#Refresh our backup file (with GNU tar, -u creates the archive on the first run)
tar -uf backup.tar <directory to backup>

#Optional : compress a copy for FTP-ing (keep the .tar so the next update run still finds it)
gzip -c backup.tar > backup.tar.gz

ftp -n $HOST <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
put backup.tar.gz
quit
END_SCRIPT
exit 0

Call something like that from CRON as mentioned above.

You might then want another script on the FTP server receiving the files to rotate them, just in case you end up with a defective file for some reason (so you have several).
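A rotation script on the receiving end could be as simple as date-stamping each upload and pruning old copies. A rough sketch, with made-up paths and an arbitrary "keep 7" policy:

```shell
#!/bin/sh
# Hypothetical rotation sketch (names are illustrative): date-stamp each
# incoming backup and keep only the 7 newest copies.
BACKUP_DIR=/tmp/backup-rotation-demo
mkdir -p "$BACKUP_DIR"
: > /tmp/backup.tar.gz                      # stand-in for the uploaded file
cp /tmp/backup.tar.gz "$BACKUP_DIR/backup-$(date +%Y%m%d-%H%M%S).tar.gz"
# ls -t lists newest first; tail -n +8 passes everything after the 7th to rm.
# (xargs -r is a GNU extension: skip rm entirely when the list is empty.)
ls -t "$BACKUP_DIR"/backup-*.tar.gz | tail -n +8 | xargs -r rm -f
```

That way a corrupt upload only costs you one generation, not the whole backup.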

TJ

Mr Bo Jangles

9:44 am on May 9, 2006 (gmt 0)

10+ Year Member



For a couple of years I used Second Copy & WebDrive to back up my web server to my home computer, but it all got to be too much. Then I was pointed to a very economical solution - bqbackup

[no affiliations, except a happy user]

txbakers

11:30 am on May 11, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I use a program called PeerSync to run directory backups every two hours. It's a Windows server. I've been happy with it.