Forum Moderators: phranque
I know that all serious people hanging around forums like this one already have something in place, and I wonder how they do it.
What's the easiest way to automate the copy procedure to your local machine, or even right on the server, so you can manually sort it out later?
Thanks
Many hosts have a cron job in place that offers five-day rotating logs. Other hosts traditionally zipped the daily logs for an entire month's traffic.
From your explanation, I'm assuming you don't see accumulation beyond a day?
If you want more extensive logs, try talking to your host or find another host. Raw logs are one of my main priorities. My current cheapo host claimed, before I paid for a year, that they provided rotating logs, but their one-day rotating log in no way resembles rotating logs.
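If you run your own box rather than shared hosting, retention is usually just a logrotate setting. A hypothetical /etc/logrotate.d entry keeping a year of weekly compressed archives might look like this (the paths and the Apache reload command are assumptions; adjust for your distro and web server):

```
/var/log/apache2/*.log {
    weekly
    rotate 52
    compress
    delaycompress
    missingok
    notifempty
    sharedscripts
    postrotate
        /usr/sbin/apache2ctl graceful > /dev/null 2>&1 || true
    endscript
}
```

The `rotate 52` directive is the knob that decides how far back your logs go; a host that ships `rotate 1` is giving you the "one-day rotating log" described above.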
I have some servers where, by default, I still have logs from a year ago, while with others it's about a week or two. Both are VPSs.
I was curious about what people do in their specific cases.
Thanks
P.S.
I do use shared hosting too, but in my case everything goes into one log, which is really messy, and I'm not sure I could even read anything properly.
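For what it's worth, if that combined log has the vhost name as its first field (as with Apache's `%v` in a vhost-combined LogFormat; check your host's actual format first), an awk one-liner can split it into per-domain files. A sketch, with made-up sample data standing in for the real log:

```shell
# Hypothetical: split a vhost-combined access log into per-domain files.
# Assumes field 1 is the vhost name (Apache "%v"); verify your LogFormat.
printf '%s\n' \
  'example.com 1.2.3.4 - - "GET / HTTP/1.1" 200' \
  'example.org 5.6.7.8 - - "GET / HTTP/1.1" 200' \
  'example.com 9.9.9.9 - - "GET /a HTTP/1.1" 404' > access_log

# Append each line to a file named after its vhost field.
awk '{ print >> ($1 ".access_log") }' access_log
wc -l example.com.access_log example.org.access_log
```

If the vhost is not in the log at all, there is nothing to split on, and asking the host for per-domain logs is the only real fix.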
The file names are dynamically generated. On each weekday a file named X-myserver-var-log.tgz is created for the log archive, where X runs from 1 to 6. On Sundays the name of the backup file changes to YYYY-MM-DD-myserver-var-log.tgz, with YYYY-MM-DD the current date.
In this way I have full daily backups of the last seven days, and full weekly backups for older data.
Because this task is run from cron, an email is generated with all output of the commands. So each morning the system sends me an email with the results of the archiving, the result of the FTP push and a list of the current files on the backup server.
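For reference, the cron side of such a setup is a single crontab entry; cron mails the job's stdout and stderr to the MAILTO address automatically. A hypothetical entry running the script at 04:30 every day (the time and address are assumptions):

```
MAILTO=admin@example.com
30 4 * * * /bin/sh /usr/local/etc/allback
```

An early-morning slot keeps the backup's extra server load out of peak traffic hours.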
File: /usr/local/etc/allback
# Set some general options.
BACKSER='myserver'
BACKDAY=`/bin/date +%w`
BACKDIR=/tmp/backup-dir.$$
BASEOPT="--create --ignore-failed-read --totals --gzip"
if [ "$BACKDAY" = "0" ] ; then
BACKDAY=`/bin/date +%Y-%m-%d`
fi
# The best way to backup the MySQL databases is to dump the
# contents to an ASCII dump file. We then know that the contents
# are consistent and it is also a sort of check that the database
# is still working fine.
echo ""
echo "Dumping MySQL databases to regular file"
/usr/bin/mysqldump --all-databases --quote-names --opt > /var/www/sqldumps/all_databases.sql
# We want to include the complete path( without trailing /) in
# the tar archives. We therefore have to make sure that we are
# located in the root directory before starting the tar calls.
/bin/mkdir $BACKDIR
cd /
# The tar archives are created. Gzip compression is used. We
# could use bzip2 compression, but the results are only slightly
# better and tests showed that the backup process may take five
# to ten times longer. During backup the server has a higher
# load than normal, which may cause delays in HTTP requests. We
# therefore like to keep the high-load period as short as possible.
echo ""
echo "Creating backup of /var/www"
BACKFILE=$BACKDIR/$BACKDAY-$BACKSER-var-www.tgz
/bin/tar $BASEOPT --file $BACKFILE var/www
/bin/gzip --test --verbose $BACKFILE
echo ""
echo "Creating backup of /var/log"
BACKFILE=$BACKDIR/$BACKDAY-$BACKSER-var-log.tgz
/bin/tar $BASEOPT --file $BACKFILE var/log
/bin/gzip --test --verbose $BACKFILE
# Now do the FTP action. The FTP does a dir and quota listing
# which should be visible in the resulting email log. This makes
# it easy to test if the backup succeeded.
cd $BACKDIR
/usr/bin/ftp -v -n -i ftpback < /usr/local/etc/allback.ftp
cd /
#
# We don't need our temporary backup directory anymore. We
# delete it to preserve diskspace.
#
/bin/rm -rf $BACKDIR
File: /usr/local/etc/allback.ftp
user username password
bin
mput *.tgz
dir
site quota
quit
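To round this off: restoring from these archives is the mirror image of the tar calls in the script. Because the archives store relative paths like var/www/..., you extract them from whatever directory plays the role of /. A self-contained illustration using a throwaway archive (all paths here are made up for the demo):

```shell
# Build a throwaway archive the same way the script does: relative
# paths, gzip compression.
mkdir -p /tmp/demo/var/www
echo 'hello' > /tmp/demo/var/www/index.html
tar --create --gzip --file /tmp/demo.tgz -C /tmp/demo var/www

# List the archive contents without extracting.
tar -tzf /tmp/demo.tgz

# Restore: extract from the directory standing in for /.
mkdir -p /tmp/restore
tar -xzf /tmp/demo.tgz -C /tmp/restore
cat /tmp/restore/var/www/index.html
```

On the real server you would run the extraction from / itself, so double-check the archive listing with `tar -tzf` first to be sure you are about to overwrite what you intend to.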