| 8:27 pm on Nov 23, 2007 (gmt 0)|
|does this command seem right? |
tar cvf - directory | gzip -c > directory.gz
To test the generated file, in Linux, try:
gunzip -c directory.gz | tar tvf -
And that should give you a list of files in the archive. This is my preferred method in Linux, though I don't have all that much experience with it in Windows. You might need to call it directory.tar.gz instead of directory.gz. That way, after it's decompressed in Windows, it'll be a tar file, which still needs to be extracted.
| 10:37 pm on Nov 23, 2007 (gmt 0)|
You want tar cf - directory | gzip -9 > directory.tar.gz
1) Don't use the v option on tar, or the verbose output may get mixed in with the tar stream on stdout and corrupt the archive.
2) Use the -9 option on gzip if you want decent compression, which is probably good for backups.
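Putting both points together, a minimal sketch of the create-then-verify cycle might look like this ("directory" and the archive name are placeholders for your own paths):

```shell
#!/bin/sh
set -e

# Sample data to archive (stand-in for your real directory).
mkdir -p directory
echo "sample" > directory/file.txt

# Create the archive on stdout (no -v, so stdout carries only the tar
# stream) and compress at the highest gzip level.
tar cf - directory | gzip -9 > directory.tar.gz

# Verify: list the archive contents without extracting anything.
gunzip -c directory.tar.gz | tar tf -
```

The listing step is cheap and catches a truncated or corrupted archive before you rely on it as a backup.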
| 11:16 pm on Nov 23, 2007 (gmt 0)|
thanks! Prob solved. Not sure why it stopped working but ...
One more problem: I also need to exclude some folders for a more comprehensive backup. Bascially I need to back up /home minus a few (huge and redundant) folders. Sort of:
tar -cf - --exclude "/home/cronolog" --exclude "/home/site2/backup" --exclude "/home/site3/backup" /home | gzip -c > $dir/$day/all.tar.gz
I had it as tar -cf $dir/$day/all.tar.gz
--exclude "/home/cronolog" --exclude "/home/site2/backup" --exclude "/home/site3/backup" /home
and it worked, but there was no gzip, so the archives were too large. Also, in nano the line became too long, which caused problems (the exclusions add up).
how do I do this?
| 8:48 pm on Nov 24, 2007 (gmt 0)|
I would be more inclined to use 'find' to collect the files, filter things out with its logic or use grep, then pass the file list to tar.
One way of doing this might be:
tar cf - `find . -type f | egrep -v dir/name1` | gzip -9 > directory.tar.gz
| 10:31 pm on Nov 24, 2007 (gmt 0)|
>> tar cf - `find . -type f | egrep -v dir/name1` | gzip -9 > directory.tar.gz
Thank you! That is good as well.
Two questions: is the "-type f" there to exclude things? If so, how do I add more directories to exclude, say dir1, dir2 and dir3?
Is this much more server intensive?
Thanks a lot for your help,
| 9:58 am on Nov 25, 2007 (gmt 0)|
The -type f includes only plain files, since you don't usually need the directory entries themselves in your tar file, and it stops tar recursing into those directories and producing duplicate copies.
You can do more filtering in various ways, but the simplest is probably to replace the single egrep with a chain of them, eg:
tar cf - `find . -type f | egrep -v dir/name1 | egrep -v dir/name2 ...` ...
But if it's getting that complicated I'd break this into two steps: selecting the files, then tar-ing them up. Your tar probably has an option to archive a set of files listed in another file...
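For example, GNU tar reads a file list with -T (--files-from). A sketch of the two-step approach, with illustrative directory names standing in for your real paths:

```shell
#!/bin/sh
set -e

# Sample tree: one directory to keep, one to filter out.
mkdir -p src/keep src/backup
echo a > src/keep/a.txt
echo b > src/backup/b.txt

# Step 1: select the files, filtering out the unwanted directory.
find src -type f | grep -v 'src/backup' > filelist.txt

# Step 2: archive exactly that list (GNU tar's -T option).
tar cf - -T filelist.txt | gzip -9 > src.tar.gz
```

Keeping the list in a file also sidesteps the over-long-line problem in nano: the script stays short no matter how many exclusions you add.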
Unless you actually see a performance problem, don't worry about it. If you get into some sort of difficulties you could learn to combine the multiple egrep stages into one with a more complex regex filter expression, eg:
egrep -v '(dir/name1)|(dir/name2)'
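In full, the single-egrep filter might look like this (dir/name1 and dir/name2 are example names):

```shell
#!/bin/sh
set -e

# Sample tree with two directories to exclude and one to keep.
mkdir -p top/name1 top/name2 top/keep
echo x > top/name1/f
echo y > top/name2/f
echo z > top/keep/f

# One egrep with an alternation instead of a chain of egreps.
find top -type f | egrep -v '(top/name1)|(top/name2)' > kept.txt
cat kept.txt
```

One process instead of several, and one place to maintain the pattern.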
[edited by: DamonHD at 10:00 am (utc) on Nov. 25, 2007]
| 12:49 pm on Nov 25, 2007 (gmt 0)|
The -type f will also exclude symbolic links, which can be useful for tar.
If the multi-pipe/egrep thing doesn't do it for you, you can try the -prune option of the find command, but I'm not familiar enough with the syntax to "fix" your command to use it.
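For what it's worth, a hedged sketch of how -prune could fit the earlier /home example: -prune stops find descending into a matched directory at all, so the excluded trees are never even walked. The directory names are illustrative, and -T - (reading the file list from stdin) assumes GNU tar:

```shell
#!/bin/sh
set -e

# Sample tree mirroring the /home layout from the question.
mkdir -p home/cronolog home/site2/backup home/site3
echo big  > home/cronolog/log
echo big  > home/site2/backup/dump
echo keep > home/site3/index.html

# Prune the unwanted directories; -o -type f -print emits everything else.
find home \( -path home/cronolog -o -path home/site2/backup \) -prune \
     -o -type f -print | tar cf - -T - | gzip -9 > all.tar.gz
```

Because the pruned directories are skipped entirely, this can also be faster than filtering a full listing through egrep when the excluded trees are huge.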