
tar / gzip and CD sized chunks



4:19 pm on Mar 22, 2006 (gmt 0)

10+ Year Member

I'm setting up a backup server that will be copying the backup set to a tar.gz file for daily, weekly, and monthly archives. But these archive files could realistically get into the hundreds of gigs. Is there a way to have it create a series of, say, 600 meg files instead of one giant file?


5:33 pm on Mar 22, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

You can use the -M and --tape-length=N switches to tar.
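A minimal sketch of how that might look with GNU tar, with sizes shrunk so it runs in a moment (the paths and names here are made up). Two caveats worth knowing: GNU tar refuses to compress a multi-volume archive, so there is no z flag here and you would gzip the volumes separately; and giving several -f options lets tar fill the volumes in order instead of prompting for each one:

```shell
set -e
work=$(mktemp -d)
cd "$work"
mkdir -p src
# ~300 KiB of incompressible data so the archive must span volumes
dd if=/dev/urandom of=src/data bs=1024 count=300 2>/dev/null

# -M enables multi-volume mode; -L 200 caps each volume at 200 x 1024 bytes.
# Multiple -f options are used in order, so tar never stops to prompt.
tar -c -M -L 200 -f vol1.tar -f vol2.tar src

# Extraction takes the same list of volumes:
mkdir restore
tar -x -M -f vol1.tar -f vol2.tar -C restore
cmp src/data restore/src/data && echo "volumes verified"
```

The volumes themselves are plain tar pieces, so you could gzip each one after the fact if space matters.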


1:38 pm on Mar 23, 2006 (gmt 0)

10+ Year Member

How does this work with the file name? Say I'm backing up to tuesday.tar: will the additional files become tuesday1.tar, tuesday2.tar, etc., or is it something different?

And just to make sure, say I want to break it up into 1 GB chunks - it would be 1 x 1024 x 1024 x 1024 for the tape length, correct?
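If I'm reading the GNU tar manual right, --tape-length=N is counted in units of 1024 bytes rather than bytes, so a 1 GiB volume would actually be N = 1024 x 1024, not 1024^3 (recent versions may also accept a size suffix like 1G, but check your tar's manual):

```shell
# GNU tar's --tape-length=N counts 1024-byte units, so 1 GiB is:
GIB_UNITS=$((1024 * 1024))
echo "$GIB_UNITS"
```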


4:12 pm on Mar 23, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

Actually, I just realised that those switches are interactive, so they're not going to work too well for an automated backup. But you might be able to do this with 'split'; you'll have to reassemble the pieces afterwards:

tar czvf - source | split -b 650m - tarpiece-

...should create files tarpiece-aa tarpiece-ab tarpiece-ac...

To reassemble, copy to disk, then:

cat tarpiece-* > file.tar.gz

...will reassemble the tarfile.

P.S. Watch out for the pipe symbol in the first command.
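Putting that together, a toy run of the whole pipe might look like this (20k pieces instead of 650m so it runs quickly; note the '-' that tells split to read stdin, with 'tarpiece-' as the output prefix):

```shell
set -e
work=$(mktemp -d)
cd "$work"
mkdir -p source
# ~64 KiB of incompressible data so we get several pieces
dd if=/dev/urandom of=source/data bs=1024 count=64 2>/dev/null

# Stream the compressed archive straight into split:
tar czf - source | split -b 20k - tarpiece-

# Reassemble (the aa, ab, ac... suffixes sort correctly for the glob)
# and confirm tar/gzip still like the result:
cat tarpiece-* > file.tar.gz
tar tzf file.tar.gz >/dev/null && echo "archive OK"
```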


4:42 pm on Mar 23, 2006 (gmt 0)

10+ Year Member

Oddly, split is what I was looking at when I decided to come over here and check in on this thread :-)

What I'm going to try to do is split them into DVD-sized chunks, so that if I can't reach the server over the internet I can at least dump them to DVD. I was originally going to do CDs, but I created a tar of my backup set last night and it was 23 GB, which is way more CDs than I'd want to have to write.


11:44 pm on Mar 23, 2006 (gmt 0)

10+ Year Member

I should mention that some OSes do not like single files over 2 GB to be present on ISO9660 discs. I discovered this while doing dumps of filesystems and trying to read them in FreeBSD. There are hacks to work around this on various platforms, but I thought you should know before you try it. Maybe split at 1.5 GB rather than going over 2 GB.
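For the DVD case, a sketch along those lines, sized down so it actually runs (for real discs you'd use something like -b 1500m to stay safely under that 2 GB ISO9660 limit; 'tuesday.tar.gz.part-' is just an illustrative prefix). Writing a checksum file alongside the pieces lets each disc be verified later:

```shell
set -e
work=$(mktemp -d)
cd "$work"
mkdir -p backup
dd if=/dev/urandom of=backup/data bs=1024 count=100 2>/dev/null

# Use PIECE=1500m for real DVD-sized pieces; tiny here so the sketch runs.
PIECE=10k
tar czf - backup | split -b "$PIECE" - tuesday.tar.gz.part-

# Checksums to verify each piece after burning:
md5sum tuesday.tar.gz.part-* > MD5SUMS
md5sum -c --quiet MD5SUMS

# The pieces reassemble with cat, same as before:
cat tuesday.tar.gz.part-* > tuesday.tar.gz
tar tzf tuesday.tar.gz >/dev/null && echo "rejoined OK"
```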


