Les Mikesell wrote:
> Mike McCarty wrote:
>> I have a backup script which I run on some sort of regular
>> basis. I use tar to create an archive, which I then split
>> into pieces of CDROM size (703MB) and write to CDROMs.
>> [snip]
>> Is there a way to get tar to use the archive it is adding to
>> "in place"? I've read man and info, and I see --append
>> (which is what I was using) and --catenate (which looks
>> marginally faster, perhaps, since I compress), but see
>> no way to make it "just do it" without doing an implicit
>> copy.
> You can't append to an already gzipped file, so it must be copying the
> previous section by uncompressing from the start and recompressing a new
> copy so it can continue with the compressor in the right state. Have
> you tried not using -z with tar while creating the archive, then piping
> through gzip and split at the end?
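If I follow, that suggestion amounts to something along these lines
(the directory names here are just placeholders):

    tar cf backup.tar /some/dir1     # plain tar, no -z; /some/dir1 is a made-up path
    tar rf backup.tar /some/dir2     # --append to the uncompressed archive in place
    gzip -c backup.tar | split -b 703M - backup.tgz.   # compress and cut to CD size only at the end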
I have tried just using cat to put them together, then running
tar with -i (ignore zeroes), and I can successfully get a TOC back out.
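That is, roughly (the piece names are invented):

    cat part1.tgz part2.tgz > whole.tgz   # part1.tgz, part2.tgz stand in for the separate archives
    tar tvzif whole.tgz                   # -i lets tar read past the zeroed end-of-archive blocks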
Further, the man page for gzip states:
[QUOTE MODE ON]
Multiple compressed files can be concatenated. In this case, gunzip
will extract all members at once. For example:
gzip -c file1 > foo.gz
gzip -c file2 >> foo.gz
Then
gunzip -c foo
is equivalent to
cat file1 file2
[QUOTE MODE OFF]
So, I trow you are not quite correct on that point. I suspect
that something like
for path in $backupdirs
do
    tar cvzf - "$path" >> backup.tgz
    ...
done
will work, but needs the -i for recovery. This is not quite
what I would like, of course. Also, I'm not quite clear
whether using >> in this wise obviates the copy.
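(Recovery would then presumably be the same -i trick at extraction time,
something like

    tar xvzif backup.tgz

give or take the exact options.)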
Mike
--
p="p=%c%s%c;main(){printf(p,34,p,34);}";main(){printf(p,34,p,34);}
Oppose globalization and One World Governments like the UN.
This message made from 100% recycled bits.
You have found the bank of Larn.
I can explain it for you, but I can't understand it for you.
I speak only for myself, and I am unanimous in that!