The reason I am asking is that lately I cannot unzip the directory.gz on Windows no matter what program I use (I tried six of them). At most I get one much larger file instead of it unzipping into /folder/sub/file/html as it always did. I have a series of scripts and use Webmin to back up nightly, just in case, but I might have been fooled. I did not try this on my server... usually when I overwrite a config file, for example, I unzip it here and upload it back. For server failure I have a second HD that should back up nightly.
Does anyone have a better solution for backing up entire directories? Uploading them as they are is too time consuming.
My full command in the script I am using is:
tar cvf - $web_dir/* | gzip >$log_dir/$current_day/so-www-hour-$datestamp.zip
where the variables are defined at the top of the script. This way there is no overwriting, as the dates and folders are different. The folders and dates work fine; the content is the issue. I can unzip .gz files from other sites (i.e., downloaded programs) on my PC too.
Thanks in advance for any suggestions. Does this command seem right?
To test the generated file, in Linux, try:
gunzip -c directory.gz | tar tvf -
And that should give you a list of files in the archive. This is my preferred method in Linux, though I don't have all that much experience with it in Windows. You might need to call it directory.tar.gz instead of directory.gz. That way, after it's decompressed in Windows, it'll be a tar file, which still needs to be extracted.
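For example, assuming the archive has been renamed to directory.tar.gz, listing and extracting it on Linux would look something like:
gunzip -c directory.tar.gz | tar tvf -    # list the contents without extracting
gunzip -c directory.tar.gz | tar xvf -    # extract into the current directory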
One more problem: I also need to exclude some folders for a more comprehensive backup. Basically I need to back up /home minus a few (huge and redundant) folders. Sort of:
tar -cf - --exclude "/home/cronolog" --exclude "/home/site2/backup" --exclude "/home/site3/backup" /home | gzip -c > $dir/$day/all.tar.gz
I had it as:
tar -cf $dir/$day/all.tar.gz --exclude "/home/cronolog" --exclude "/home/site2/backup" --exclude "/home/site3/backup" /home
and it worked, but there was no gzip, making the archives too large. Also, in nano the line gets too long, which caused problems (too many exclusions add up).
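One way to sidestep the long command line (just a sketch, assuming GNU tar; the exclude-file name is made up) is to put the exclude patterns in a file, one per line, and point tar at it with --exclude-from:
# /home/backup-excludes.txt contains, one pattern per line:
#   /home/cronolog
#   /home/site2/backup
#   /home/site3/backup
tar -cf - --exclude-from=/home/backup-excludes.txt /home | gzip -c > $dir/$day/all.tar.gz
That keeps the tar line short in nano no matter how many exclusions you add.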
How do I do this?
Is this much more server intensive?
Thanks a lot for your help,
You can do more filtering in various ways, but the simplest is probably to replace the single egrep with a chain of them, e.g.:
tar cf - `find . -type f | egrep -v dir/name1 | egrep -v dir/name2 ...` ...
But if it's getting that complicated I'd break this into two steps: selecting the files and then tarring them up. Your tar probably has an option to archive a set of files listed in another file...
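With GNU tar that option is -T (--files-from), so the two-step version might look something like this (the list-file name is just an example):
find /home -type f | egrep -v '/home/cronolog|/home/site2/backup|/home/site3/backup' > /tmp/backup-list.txt
tar -czf $dir/$day/all.tar.gz -T /tmp/backup-list.txt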
Unless you actually see a performance problem, don't worry about it. If you run into some sort of difficulty you could learn to combine the multiple egrep stages into one with a more complex regex filter expression, e.g.:
egrep -v '(dir/name1)|(dir/name2)'
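Putting that together with the find-based command above, and using the directory names from earlier in the thread, a single-filter version would be roughly:
tar cf - `find /home -type f | egrep -v '(/home/cronolog)|(/home/site2/backup)|(/home/site3/backup)'` | gzip -c > $dir/$day/all.tar.gz
Though if /home contains a lot of files the backtick expansion can get unwieldy, which is another argument for the two-step approach with a list file.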