Publication
ISIT 1993
Conference paper

Huffman algebras for independent random variables

Abstract

The Huffman algorithm was originally devised to construct the minimal expected length instantaneous source code for a random variable. In this paper, we identify the Hardy-Littlewood-Pólya inequality as the key step in the proof of the optimality of the Huffman algorithm and use this to provide a unified framework for various applications of the Huffman algorithm. Quantities that are analogous to entropy are identified for these applications. We also consider the case when the weights of the leaves of the tree are independent random variables. A Huffman algebra is defined, which provides conditions under which the Huffman algorithm is optimal for the case of random weights. In particular, it is shown that the 'most balanced' tree is optimal for the case of independent and identically distributed weights for any arrangement-increasing Huffman algebra.
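For readers unfamiliar with the classical algorithm the abstract refers to, the sketch below (not from the paper, and using hypothetical names such as huffman_code) illustrates the standard greedy construction: repeatedly merge the two smallest weights until a single tree remains, which yields the prefix code of minimal expected length for deterministic weights.

```python
# Minimal sketch of the classical Huffman algorithm (illustrative only;
# the paper generalizes this setting to algebras and random weights).
import heapq

def huffman_code(weights):
    """Return a dict mapping symbol index -> binary codeword, built by the
    classical Huffman merge on the given (deterministic) weights."""
    # Heap entries: (weight, tie_breaker, subtree); a subtree is either a
    # symbol index or a pair of subtrees.
    heap = [(w, i, i) for i, w in enumerate(weights)]
    heapq.heapify(heap)
    counter = len(weights)
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)  # smallest remaining weight
        w2, _, t2 = heapq.heappop(heap)  # second smallest
        heapq.heappush(heap, (w1 + w2, counter, (t1, t2)))  # merged subtree
        counter += 1
    code = {}
    def assign(tree, prefix):
        if isinstance(tree, tuple):
            assign(tree[0], prefix + "0")
            assign(tree[1], prefix + "1")
        else:
            code[tree] = prefix or "0"  # single-symbol edge case
        return code
    return assign(heap[0][2], "")

# Example: weights 0.4, 0.3, 0.2, 0.1 give codeword lengths 1, 2, 3, 3
# and expected length 1.9 bits.
print(huffman_code([0.4, 0.3, 0.2, 0.1]))
```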
