Re: Linux Kernel Source Compression

On Tue, 23 May 2006, Julian Seward wrote:

> It uses an adaptive Huffman scheme devised by David Wheeler, which
> usually gets within 1% of the arithmetic coder that bzip1 used.

If that coder has patent issues, it shouldn't be used, of course, regardless of performance.
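
For anyone who hasn't looked at that coder family: bzip2's actual scheme is
more elaborate (Wheeler's coder keeps several Huffman tables and reselects
among them every 50 symbols), but the core is ordinary Huffman construction.
A minimal self-contained sketch, with made-up symbol counts, not bzip2's
actual code:

#include <stdio.h>

#define NSYM  8                      /* illustrative alphabet size */
#define NNODE (2 * NSYM - 1)         /* leaves plus internal nodes */

struct node { long freq; int left, right; }; /* left/right = -1 for leaves */

/* A symbol's code length is just the depth of its leaf in the tree. */
static void set_lengths(const struct node *t, int i, int depth, int *len)
{
    if (t[i].left < 0) {             /* leaf: record its depth */
        len[i] = depth;
        return;
    }
    set_lengths(t, t[i].left,  depth + 1, len);
    set_lengths(t, t[i].right, depth + 1, len);
}

int main(void)
{
    long counts[NSYM] = { 45, 13, 12, 16, 9, 5, 30, 20 }; /* made-up */
    struct node t[NNODE];
    int alive[NNODE] = { 0 }, len[NNODE] = { 0 };
    int i, n;

    for (i = 0; i < NSYM; i++) {
        t[i].freq = counts[i];
        t[i].left = t[i].right = -1;
        alive[i] = 1;
    }

    /* Classic construction: repeatedly merge the two rarest live nodes. */
    for (n = NSYM; n < NNODE; n++) {
        int a = -1, b = -1;
        for (i = 0; i < n; i++) {
            if (!alive[i])
                continue;
            if (a < 0 || t[i].freq < t[a].freq)      { b = a; a = i; }
            else if (b < 0 || t[i].freq < t[b].freq) { b = i; }
        }
        t[n].freq  = t[a].freq + t[b].freq;
        t[n].left  = a;
        t[n].right = b;
        alive[a] = alive[b] = 0;
        alive[n] = 1;
    }

    set_lengths(t, NNODE - 1, 0, len);  /* root is the last node built */
    for (i = 0; i < NSYM; i++)
        printf("symbol %d: count %ld -> %d bits\n", i, counts[i], len[i]);
    return 0;
}

The quoted 1% figure says that these whole-bit codes come surprisingly close
to the fractional-bit ideal that arithmetic coding achieves.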

> bzip2, especially the 1.0.X series, is superior to bzip1 in terms of
> speed, memory use, robustness against bad-case inputs, recoverability
> of data from damaged compressed streams, and the fact that it can be
> used as a library.

Superior in most respects, yes, but not in compression ratio. Anyway, calling bzip2 a step backwards was a bit of a provocation, not really meant seriously; still, its compression ratio is slightly worse.

Maybe bzip2 could be updated to make better use of today's fast CPUs? A much larger sorting block (bzip2's analogue of a dictionary), or other computationally expensive improvements; a sketch of where the current limit sits follows below.
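
To make the library point concrete, and to show where the ceiling sits
today: with libbzip2 the whole job is one call, and the blockSize100k
parameter tops out at 9, i.e. 900 kB blocks. A minimal sketch (buffer
contents and sizes are illustrative, error handling trimmed):

#include <stdio.h>
#include <string.h>
#include <bzlib.h>               /* link with -lbz2 */

int main(void)
{
    char src[4096] = "pretend this buffer holds the kernel tarball";
    unsigned int srcLen = (unsigned int)strlen(src);

    /* libbzip2 documents needing sourceLen + 1% + 600 bytes of room */
    char dst[4096 + 4096 / 100 + 600];
    unsigned int dstLen = (unsigned int)sizeof dst;

    /* blockSize100k = 9 is the format's current maximum: 900 kB blocks */
    int rc = BZ2_bzBuffToBuffCompress(dst, &dstLen, src, srcLen,
                                      9,   /* blockSize100k, 1..9     */
                                      0,   /* verbosity               */
                                      0);  /* workFactor, 0 = default */
    if (rc != BZ_OK) {
        fprintf(stderr, "BZ2_bzBuffToBuffCompress failed: %d\n", rc);
        return 1;
    }
    printf("compressed %u -> %u bytes\n", srcLen, dstLen);
    return 0;
}

Raising that cap would mean a format change, which is presumably the sort
of computationally expensive improvement worth debating.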

Regards, Nuri
