Citation
Abstract
In this article, the authors consider the following question about Huffman coding, an important technique for compressing data from a discrete source: if p is the smallest source probability, how long, in terms of p, can the longest Huffman codeword be? It is shown that if p is in the range 0 < p ≤ 1/2, and if K is the unique index such that 1/F_{K+3} < p ≤ 1/F_{K+2}, where F_K denotes the Kth Fibonacci number, then the longest Huffman codeword for a source whose least probability is p is at most K, and no better bound is possible. Asymptotically, this implies the surprising fact that for small values of p, a Huffman code’s longest codeword can be as much as 44 percent larger than that of the corresponding Shannon code.
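The bound in the abstract can be checked numerically. The sketch below (helper names are mine, not from the paper) builds Huffman codeword lengths for a source whose probabilities are proportional to Fibonacci numbers, the classic worst case that yields a maximally skewed Huffman tree, and compares the longest codeword against the index K with 1/F_{K+3} < p ≤ 1/F_{K+2}:

```python
import heapq
from fractions import Fraction

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given probabilities."""
    # Heap entries: (subtree probability, tie-breaker id, leaf indices in subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    next_id = len(probs)
    while len(heap) > 1:
        p1, _, leaves1 = heapq.heappop(heap)
        p2, _, leaves2 = heapq.heappop(heap)
        for i in leaves1 + leaves2:
            lengths[i] += 1          # each merge adds one bit to every leaf below it
        heapq.heappush(heap, (p1 + p2, next_id, leaves1 + leaves2))
        next_id += 1
    return lengths

def fibonacci_bound(p):
    """The paper's K: the unique index with 1/F(K+3) < p <= 1/F(K+2),
    where F(1) = F(2) = 1 are Fibonacci numbers."""
    fibs = [0, 1, 1]                 # fibs[k] = F(k) for k >= 1
    while fibs[-1] * p <= 1:         # i.e. while 1/F(k) >= p, keep growing
        fibs.append(fibs[-1] + fibs[-2])
    return (len(fibs) - 1) - 3       # the last index generated is K + 3

# Fibonacci-proportional weights: 1+1+2+3+5+8+13+21 = 54.
weights = [1, 1, 2, 3, 5, 8, 13, 21]
probs = [Fraction(w, 54) for w in weights]
lengths = huffman_lengths(probs)
print(max(lengths), fibonacci_bound(Fraction(1, 54)))   # prints "7 7": the bound is met
```

Here the smallest probability is p = 1/54, and since F_9 = 34 ≤ 54 < F_{10} = 55, we get K = 7, exactly the depth of the longest Huffman codeword, so the bound is tight for this source. By contrast, a Shannon code would assign that symbol length ⌈log₂ 54⌉ = 6; asymptotically the ratio approaches 1/log₂ φ ≈ 1.44, the 44 percent figure in the abstract.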
Details
- Volume: 42-110
- Published: August 15, 1992
- Pages: 188–193