
Forum Moderators: coopster & jatar k


Is this an acceptable way to backup a server?


dowzer

11:10 am on Sep 2, 2010 (gmt 0)

5+ Year Member



I have a CentOS server running an application with a MySQL backend and a PHP front end. I need to take regular backups of the files and database (currently every 2 hours), but my host is trying to charge me a small fortune for a service to do so: they only offer 24-hour backups themselves and seem to think every 2 hours warrants a king's ransom.

Is it plausible that I could get a shared hosting account with another provider and script it so that every 2 hours a database dump is made and copied to the new server, along with all the PHP files, into a folder created with a unique name for each backup (say, for example, a date and time stamp)?
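That idea maps onto standard tools. A minimal sketch of one backup run; every path, the database name `appdb`, the credentials, and the remote host below are assumptions for illustration, not details from the thread:

```shell
#!/bin/sh
# Sketch of one backup run; all names and paths are assumptions.
STAMP=$(date +%Y%m%d-%H%M)            # unique folder name per backup
BACKUP_DIR=/var/backups/app/$STAMP
mkdir -p "$BACKUP_DIR"

# Dump the database; --single-transaction gives a consistent
# InnoDB snapshot without locking the tables during the dump.
mysqldump --single-transaction -u backupuser -p'SECRET' appdb \
  > "$BACKUP_DIR/appdb.sql"

# Ship the dump and the PHP document root to the remote account.
# rsync over SSH with key-based login works on most shared hosts.
rsync -az "$BACKUP_DIR" /var/www/html \
  "remoteuser@backup.example.com:backups/$STAMP/"
```

Because the folder name is a date stamp, each run lands in its own uniquely named directory on both machines.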

When that is complete, I would like the most recent x database dumps to be left on the main server for quick recovery, while y complete backups are kept on the remote server for DR purposes. Anything older than the retention period I want to maintain would be deleted.
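The retention step can be as simple as sorting the timestamped folder names (date stamps sort chronologically) and removing everything past the newest y. A sketch, with the root path and the keep-count as assumptions:

```shell
#!/bin/sh
# Sketch: keep the newest $KEEP backup directories, delete the rest.
# BACKUP_ROOT and KEEP are assumed values; run the same logic with
# different numbers locally (x dumps) and remotely (y full backups).
BACKUP_ROOT=/var/backups/app
KEEP=5

# Names like 20100902-1100 sort chronologically, so after a reverse
# sort everything beyond the first $KEEP entries is past retention.
ls -1 "$BACKUP_ROOT" | sort -r | tail -n +$((KEEP + 1)) | while read -r old; do
  rm -rf "$BACKUP_ROOT/$old"
done
```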

I appreciate that it will need to be scripted (which I can't do myself, but I'm sure I can find someone who can!), and if I then need to restore some data from the database it will be a manual process, but I am fine with that. Is the concept plausible, or is there a better/easier way of doing this?
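The "every 2 hours" part needs no scripting at all beyond cron. A sketch, assuming the dump-and-copy commands are collected in a script at a hypothetical path `/usr/local/bin/app-backup.sh`:

```shell
# Hypothetical crontab entry: run the backup script every 2 hours,
# on the hour, appending output to a log for troubleshooting.
0 */2 * * * /usr/local/bin/app-backup.sh >> /var/log/app-backup.log 2>&1
```

Changing the SLA to every 4 or 6 hours later is then a one-character edit to the `*/2` field.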

Thanks

Jase

lammert

1:54 pm on Sep 2, 2010 (gmt 0)

WebmasterWorld Senior Member, 10+ Year Member



The concept looks OK. You will have to find a shared hosting account that allows that much data uploading per day, though. Often the small print in shared hosting contracts contains a list of disallowed uses of the shared hosting space. Alternatively, you could look into specialized data storage providers such as Amazon S3, which offer space specifically for upload and storage.
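For S3, a tool such as s3cmd can push a finished backup folder straight to a bucket. A sketch with a made-up bucket name and timestamp:

```shell
# Upload one timestamped backup directory to S3; the bucket name
# and local path are assumptions. s3cmd needs access keys set up
# first via `s3cmd --configure`.
s3cmd put --recursive /var/backups/app/20100902-1100/ \
  s3://example-backup-bucket/app/20100902-1100/
```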

optik

2:43 pm on Sep 2, 2010 (gmt 0)

5+ Year Member



Why do you need such frequent backups, if you don't mind me asking?

I would hope you'd be doing longer-term backups as well; otherwise, if you encounter a problem and don't spot it within your 2-hour time frame, your backup will be useless.

dowzer

2:54 pm on Sep 2, 2010 (gmt 0)

5+ Year Member



Thanks Lammert - the database and files will not actually be that big initially, so the bandwidth and storage requirements will be quite minimal. Even the entry-level hosting package from my current host offers 150GB of bandwidth per month, which allows me about 400MB of transfer per backup session before I exceed that allowance.
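That 400MB figure checks out: at one backup every 2 hours there are 12 sessions a day, roughly 360 a month, and 150GB divided by 360 sessions is a little over 400MB each. As shell arithmetic:

```shell
# Rough check of the 150 GB/month allowance (integer shell arithmetic).
SESSIONS=$((12 * 30))                      # 12 backups/day for 30 days
PER_SESSION_MB=$((150 * 1024 / SESSIONS))  # MB available per session
echo "$SESSIONS sessions, about ${PER_SESSION_MB} MB each"
```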

dowzer

2:59 pm on Sep 2, 2010 (gmt 0)

5+ Year Member



Optik - the application is a paid, cloud-based application containing critical business data, so we need to be able to offer regular backups in the event of hardware failure, database corruption, etc. The actual SLA to customers has not yet been defined, but the interval will never be shorter than 2 hours, so if I know I can do it every 2 hours this way then I am fine with it; when it goes live, it may well be every 4 or 6 hours.

At the moment the retention period is not determined but, as per the original post, I am planning to keep multiple copies/versions on the remote server so we can go back to a point in time earlier than the most recent backup.
 
