I'm setting up a backup server that will be copying the backup set to a tar.gz file for daily, weekly, and monthly archives. But these archive files could realistically get into the hundreds of gigs. Is there a way to have it create a series of, say, 600 meg files instead of one giant file?
Actually, I just realised that those switches are interactive, so it's not going to work too well for an automated backup, but you might be able to do this with 'split'. You'll have to reassemble the pieces afterwards: tar czvf - source | split -b 650m - tarpiece-
Oddly, split is what I was looking at when I decided to come over here and check in on this thread :-)
What I'm going to try to do is split them into DVD-sized chunks, so that if I can't reach the server over the internet I can at least dump them to DVD. I was originally going to do CDs, but I created a tar of my backup set last night and it was 23g, which is way more CDs than I'd want to have to write.
Msg#: 1693 posted 11:44 pm on Mar 23, 2006 (gmt 0)
I should mention that some OSes do not like single files over 2GB to be present on ISO9660 disks. I discovered this while doing dumps of filesystems and trying to read them in FreeBSD. There are hacks to fix this issue on various platforms, but I thought you should know before you try that. Maybe split to 1.5GB rather than 2GB or more.