Forum Moderators: mack
I can't even figure out how to back up all sites at once using Ensim, because it requires an FTP server and location. Also, since I upgraded to the latest phpMyAdmin and MySQL, it won't let me back up a database; it just hangs forever now.
So is there a script I can install that will back up everything and send it to me?
The reason I got my own server is that every host I used had major problems, so I'm having to learn how to manage one myself without any prior knowledge whatsoever.
I've found a backup.sh script for backing up all the sites and databases on a server, but there are 24 pages of help threads where I found it, so people are clearly having problems with it.
I have a backup script which will back up any files/folders you specify, plus any MySQL databases you want. It then tars it all into a single .tar.gz file.
You run it from cron, and the script itself checks the hour before doing the actual backup. I have mine set to 3, so it runs at 3am.
You would then need a way of getting this off your server, for safe storage.
Will this help you?
wruk999
First, here is the script. Call it backup.pl and put it in a folder outside the web root of your sites.
backup.pl
================
#!/usr/bin/perl
#### CONFIG ####
$BackupHour = 3; # TIME TO RUN THE BACKUP
$BackupPath = "/backup"; #This is the folder where your backup file is stored
$BackupFileName = "backup.tar.gz"; #The name you want to give to your tar file
$BackupFileList = "file_list.txt"; #The list that you use to determine what parts to back up
$HostHasMySQL = 0; # 1 = YES, 0 = NO
$MySQLRootPassword = '';
$MySQLBackupFileList = "mysql_list.txt"; #The list that you use to determine which databases to backup.
################
($sec,$min,$hour,$mday,$mon,$year,$wday,$yday,$isdst) = localtime(time);
if ($hour == $BackupHour) {
if ($HostHasMySQL == 1) {
&DumpDatabases;
}
&BackupFiles;
}
#### SUBS ####
sub BackupFiles
{
# remove old backup file
$systemresult = system("rm -f $BackupPath/$BackupFileName"); # -f so the first run doesn't complain
# create .tar.gz file
$systemresult = system("tar czf $BackupPath/$BackupFileName -T $BackupPath/$BackupFileList > /dev/null 2>&1");
}
sub DumpDatabases
{
my $name;
if (open(DBLIST, "$BackupPath\/$MySQLBackupFileList")) {
while (<DBLIST>) {
if (/^\s*\#/ || /^\s*$/) {
# Ignore Comments and Blank Lines
} elsif ( /^\s*([\w]+)\s*/ ) {
$name = $1;
# dump each database to its own .sql file in $BackupPath
# (add $BackupPath to file_list.txt if you want the dumps in the tar)
$systemresult = system("mysqldump --opt --user=root --password=$MySQLRootPassword $name > $BackupPath/$name.sql");
}
}
close (DBLIST);
}
}
Create a file, file_list.txt, and list the folders that you want backed up, one per line.
eg:
/www/home
/www/public
/etc/passwd
The bottom one shows that you can also back up individual files. If you put a folder in the list, it will back up the contents of that folder and all of its subfolders.
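The folder/file handling above is just tar's -T flag reading the list, the same way backup.pl does. Here is a quick self-contained sketch you can run to see it (the paths under /tmp/demo are made up for the demo):

```shell
# Create two fake "site" folders with a file in each.
mkdir -p /tmp/demo/site1 /tmp/demo/site2
echo "hello" > /tmp/demo/site1/index.html
echo "world" > /tmp/demo/site2/index.html

# Write a file list, one path per line, like file_list.txt.
printf '%s\n' /tmp/demo/site1 /tmp/demo/site2 > /tmp/demo/file_list.txt

# Tar everything named in the list, then list the archive contents.
tar czf /tmp/demo/backup.tar.gz -T /tmp/demo/file_list.txt
tar tzf /tmp/demo/backup.tar.gz
```

The last command lists what ended up in the archive, which is also a handy sanity check on your real backup file.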
For mysql_list.txt, just enter the database names on separate lines.
eg:
site1dbase
site2dbase
mydbase
Set it up to run either on cron, or run it from the command line.
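For the cron route, a crontab entry like this would do it (a sketch; adjust the path to wherever you saved backup.pl). Since the script checks $BackupHour itself, scheduling it hourly is fine:

```shell
# Run the script at the top of every hour; it only does the
# actual backup when the hour matches $BackupHour (3am here).
0 * * * * /usr/bin/perl /backup/backup.pl
```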
A .tar.gz file will be created. If you want a copy off the server, you will need to download it, but beware: if all your sites and databases are in it, it could be a large file!
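One way to get it off the server (a sketch, assuming the /backup path above, SSH access, and made-up user/host/destination names) is scp or rsync, run from the machine you want the copy on:

```shell
# Run these from your local machine, not the server.
# One-off copy of the archive:
scp user@your-server:/backup/backup.tar.gz /home/you/backups/

# Or rsync with compression, which can resume/skip unchanged data:
rsync -avz user@your-server:/backup/backup.tar.gz /home/you/backups/
```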
Hope this helps,
wruk999