
Forum Moderators: ergophobe


Version backups for joomla?

How to restore an oops?

1:00 am on Nov 11, 2011 (gmt 0)

Preferred Member from US 

10+ Year Member

joined:May 6, 2004
posts: 650
votes: 0

I have a Joomla 1.7 site. I was fiddling with one of the pages and lost part of the content in an article.

I can fix this simply enough. However, it got me thinking about the depth of my backups.

I am running Akeeba Backup now, the latest version. I have a backup from prior to the change. However, I don't want to restore the entire site. Right now, I am in the process of extracting the downloaded version of the site. I understand that there is a way to get the content for one article that way.

My questions:

1. Are there any tools that can take periodic snapshots of the site? I did see some version extensions and may try them. I was also thinking about the possibility of a site downloader since this site is less than 150 pages.

2. Has anyone done what I am attempting with Joomla/Akeeba? From the documentation I saw, it said it would take 'ten minutes'. I did set up PHP and MySQL on my local Windows machine. The idea is that you download the backup file, extract it, get it running on your local machine, find the article, then copy and paste to your live site... THAT WILL TEACH ME NOT TO SCREW UP.

As I think about it, I'm leaning toward running Akeeba to recover from a total crash or irrecoverable hack, but also running a site downloader to make periodic snapshots of the content, as well as some type of version tool. (i.e. belt, suspenders, and safety pins.)

/rambling guy


12:20 am on Nov 12, 2011 (gmt 0)

Moderator from US 

WebmasterWorld Administrator travelin_cat

joined:Feb 28, 2004
posts: 3222
votes: 12

Chris, once a week I download a copy of the database via cPanel. I keep these for a year. I have never trusted Akeeba, as some of the backups I received were corrupted. I also download a complete copy of the site's home directory. On one of my larger sites, both of these actions took less than 10 minutes to complete.

You can access pretty much any article from the db if you know what you are doing.
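As a sketch of what that looks like against Joomla's stock content table (note: `jos_` is only the classic default table prefix; Joomla 1.6+ generates a random prefix at install time, so substitute your site's own, and the article title here is a placeholder):

```sql
-- Pull one article's body text straight from the content table.
-- `fulltext` is a MySQL reserved word, hence the backticks.
SELECT id, title, introtext, `fulltext`
FROM jos_content
WHERE title LIKE '%my lost article%';
```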

10:18 pm on Nov 13, 2011 (gmt 0)

Moderator This Forum

WebmasterWorld Administrator ergophobe

joined:Apr 25, 2002
votes: 222

I've mentioned this lots of times before, but here's my poor man's backup for a small site database. If you get over a few thousand pages, this might not work, but it's crazy simple.

1. Build site.
2. Get GMail account sitename@gmail.com.
3. Write a script that runs a MySQL dump, gzips it, and emails it to said GMail account with the date and time as the subject line.
4. Set up a cron job to run as often as you need it, depending on how active the site is.

Voila, 7.647 GB of free backup space, all arranged by date. A complete archive. If you have a few thousand pages and the attachment is, let's choose a random number like 7.647 MB, you have 1000 backups you can keep. At 2x per day, that's 500 days. Assuming you're also downloading the occasional snapshot of the site somewhere else, once a year you just go in there and delete all the backups.
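Step 3 above could be sketched roughly like this (a sketch only: the database name, GMail address, and password below are placeholders, not anything from this thread; GMail's SMTP host and port are its standard ones):

```python
"""Poor man's DB backup: dump, gzip, and email to a dedicated GMail account."""
import gzip
import smtplib
import subprocess
from datetime import datetime
from email.message import EmailMessage

DB_NAME = "joomla_db"                   # hypothetical database name
GMAIL_USER = "sitename@gmail.com"       # the dedicated backup account
GMAIL_PASSWORD = "app-password-here"    # placeholder credential

def backup_subject(now: datetime) -> str:
    """Date-and-time subject line so GMail keeps the backups sorted."""
    return f"{DB_NAME} backup {now:%Y-%m-%d %H:%M}"

def run_backup() -> None:
    # 1. Dump the database (mysqldump must be on PATH and configured).
    dump = subprocess.run(
        ["mysqldump", "--single-transaction", DB_NAME],
        check=True, capture_output=True).stdout
    # 2. Gzip the dump in memory.
    payload = gzip.compress(dump)
    # 3. Mail it to the backup account as an attachment.
    msg = EmailMessage()
    msg["From"] = msg["To"] = GMAIL_USER
    msg["Subject"] = backup_subject(datetime.now())
    msg.set_content("Automated database backup.")
    msg.add_attachment(payload, maintype="application",
                       subtype="gzip", filename=DB_NAME + ".sql.gz")
    with smtplib.SMTP_SSL("smtp.gmail.com", 465) as smtp:
        smtp.login(GMAIL_USER, GMAIL_PASSWORD)
        smtp.send_message(msg)

# To run by hand: run_backup()
```

For step 4, a crontab line along the lines of `0 3 * * * python3 /path/to/backup.py` would run it nightly; adjust the schedule to how active the site is.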
