I have a CentOS server running an application with a MySQL backend and a PHP front end. I need to take regular backups of the files and the database (currently every 2 hours), but my host only offers 24-hour backups themselves and wants to charge a small fortune for anything more frequent.
Is it feasible to get a shared hosting account with another provider and script it so that every 2 hours a database dump is made and copied, along with all the PHP files, to the new server, into a folder created with a unique name for each backup (for example, a date and time stamp)?
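To show the idea is workable, here is a minimal sketch of such a script. The database name, paths, remote host, and user are all assumptions, and it defaults to a dry run that just prints the commands it would execute, so nothing is touched until you review it:

```shell
#!/bin/sh
# Sketch of a 2-hourly backup: dump the DB and tar the PHP files into a
# timestamped folder, then push it to the remote host. All names below
# (mydb, /var/www/html, backupuser@otherhost) are placeholders.
STAMP=$(date +%Y-%m-%d_%H-%M)
BACKUP_ROOT="${BACKUP_ROOT:-${TMPDIR:-/tmp}/app-backups}"
DEST="$BACKUP_ROOT/$STAMP"
mkdir -p "$DEST"

if [ "${DRY_RUN:-1}" = "1" ]; then
  # Dry run (the default): print the plan instead of running it.
  cat <<EOF
mysqldump --single-transaction mydb | gzip > $DEST/db.sql.gz
tar czf $DEST/php-files.tar.gz -C /var/www html
rsync -a $DEST backupuser@otherhost:/backups/
EOF
else
  # Real run: MySQL credentials would come from ~/.my.cnf so they
  # stay off the command line.
  mysqldump --single-transaction mydb | gzip > "$DEST/db.sql.gz"
  tar czf "$DEST/php-files.tar.gz" -C /var/www html
  rsync -a "$DEST" backupuser@otherhost:/backups/
fi
```

Because the timestamp is part of the folder name, every run lands in its own directory, which is exactly the unique-name-per-backup scheme described above.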
When that is completed, I would like the most recent x database dumps to be kept on the main server for quick recovery, with y complete backups kept on the remote server for DR reasons, and anything older than the retention period I want to maintain deleted.
I appreciate that this will need to be scripted (which I can't do myself, but I'm sure I can find someone who can), and that restoring data from the database would then be a manual process, which I am fine with. Is the concept feasible, or is there a better/easier way of doing this?
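For the scheduling side, a single cron entry on the CentOS box would run such a script every 2 hours; the script path and log location here are assumptions:

```
# Run the backup script at minute 0 of every 2nd hour, logging output.
0 */2 * * * /usr/local/bin/app-backup.sh >> /var/log/app-backup.log 2>&1
```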