
Informational Divergence and Entropy Rate on Rooted Trees with Probabilities

Abstract

Rooted trees with probabilities are used to analyze properties of a variable-length code. A bound on the difference between the entropy rates of the code and a memoryless source is derived in terms of the normalized informational divergence. The bound is used to derive converses for exact random number generation, resolution coding, and distribution matching.

Comment: 5 pages, with proofs and illustrating examples.
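The abstract does not reproduce the bound itself. As a hedged sketch of the quantities it refers to, one common setup considers a rooted tree with leaf set $\mathcal{Y}$, leaf distribution $P_Y$ induced by the branching probabilities, leaf depths $\ell(y)$, and a memoryless source with per-letter distribution $P_X$; the notation below is illustrative and not taken from the paper.

\[
  \frac{H(P_Y)}{\mathbb{E}[\ell(Y)]}
  \quad\text{(entropy rate of the code)},
  \qquad
  \frac{D\!\left(P_Y \,\middle\|\, P_X^{\ell}\right)}{\mathbb{E}[\ell(Y)]}
  \quad\text{(normalized informational divergence)},
\]
\[
  P_X^{\ell}(y) \;=\; \prod_{i=1}^{\ell(y)} P_X(y_i),
\]

where $P_X^{\ell}(y)$ is the probability the memoryless source assigns to the symbol string labeling the path from the root to leaf $y$. The converses mentioned in the abstract relate the gap between these two entropy rates to the normalized divergence above.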
