On Wed, 22 Jun 2005, sharif islam wrote:
Is there any compression mechanism that's rated for tar files as big as, say, 100GB?
gzip and bzip2 both handle large files. What are your evaluation criteria?
Are you CPU bound (IOW, how much time/power it takes to compress is your most important criterion, and how much it compresses doesn't matter so much)? gzip is good for that (much faster than bzip2).
Are you space bound (IOW, what matters most is how much it compresses, even if it takes longer/more power)? bzip2 is good for that (it often compresses as much as twice as well as gzip or zip). If you have multiple processors, there is even a version that can exploit several of them to accelerate compression (Parallel BZIP2 <URL:http://compression.ca/pbzip2/>).
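As a rough illustration of the speed/size trade-off (not from the original thread, and the exact numbers will vary with your data and machine), here is a small sketch using Python's stdlib bindings to the same zlib/bzip2 libraries:

```python
# Compare gzip vs bzip2 on some highly redundant sample data.
# Sizes and timings are illustrative only; real tar archives
# will behave differently depending on their contents.
import gzip
import bz2
import time

data = b"the quick brown fox jumps over the lazy dog\n" * 100_000

t0 = time.perf_counter()
gz = gzip.compress(data, compresslevel=9)
t_gz = time.perf_counter() - t0

t0 = time.perf_counter()
bz = bz2.compress(data, compresslevel=9)
t_bz = time.perf_counter() - t0

print(f"original: {len(data):>9} bytes")
print(f"gzip -9 : {len(gz):>9} bytes in {t_gz:.3f}s")
print(f"bzip2 -9: {len(bz):>9} bytes in {t_bz:.3f}s")
```

On typical text-heavy data, bzip2's output is noticeably smaller and its run time noticeably longer, which is the trade-off described above.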
Does the 'tar file' contain a lot of 'precompressed' data (zip/gz/mpg/avi/wmf/mp3/jpg/gif/png etc.)? If so, you may not get a lot of (or any) additional compression from _any_ general-purpose compression program.
You haven't specified your problem well enough to get good recommendations.
-- Benjamin Franz
Simple things should be simple, complex things should be possible. - Alan Kay