207 research outputs found
Informational Divergence Approximations to Product Distributions
The minimum rate needed to accurately approximate a product distribution
based on an unnormalized informational divergence is shown to be a mutual
information. This result subsumes results of Wyner on common information and
Han-Verdú on resolvability. The result also extends to cases where the
source distribution is unknown but the entropy is known.
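Every abstract in this list measures approximation error the same way; a minimal sketch of the (unnormalized) informational divergence in bits, assuming finite distributions over a common alphabet:

```python
import math

def informational_divergence(p, q):
    """Unnormalized informational divergence D(P||Q) in bits.

    p, q: probability lists over the same alphabet; requires q[i] > 0
    wherever p[i] > 0, otherwise the divergence is infinite.
    """
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# D(P||Q) is zero iff P = Q and strictly positive otherwise.
print(informational_divergence([0.5, 0.25, 0.25], [0.25, 0.5, 0.25]))  # → 0.25
```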
Fixed-to-Variable Length Distribution Matching
Fixed-to-variable length (f2v) matchers are used to reversibly transform an
input sequence of independent and uniformly distributed bits into an output
sequence of bits that are (approximately) independent and distributed according
to a target distribution. The degree of approximation is measured by the
informational divergence between the output distribution and the target
distribution. An algorithm is developed that efficiently finds optimal f2v
codes. It is shown that by encoding the input bits blockwise, the informational
divergence per bit approaches zero as the block length approaches infinity. A
relation to data compression by Tunstall coding is established. Comment: 5 pages, essentially the ISIT 2013 version
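The matching idea can be illustrated with a toy sketch (not the paper's optimal f2v algorithm; the code table and target here are hypothetical, and the parsing runs variable-to-fixed rather than blockwise f2v): parsing i.i.d. uniform bits with a full prefix-free tree produces symbols with the induced dyadic distribution, whose divergence from the target can then be evaluated.

```python
import math, random

# Full prefix-free tree: each codeword of length L occurs with
# probability 2^-L under uniform bits, so outputs are dyadic.
CODE = {'0': 'a', '10': 'b', '11': 'c'}   # induces a:1/2, b:1/4, c:1/4
TARGET = {'a': 0.5, 'b': 0.3, 'c': 0.2}   # hypothetical target distribution

def match(bits):
    """Greedily parse a bit string into symbols via the prefix-free CODE."""
    out, word = [], ''
    for b in bits:
        word += b
        if word in CODE:
            out.append(CODE[word])
            word = ''
    return out

random.seed(0)
bits = ''.join(random.choice('01') for _ in range(50000))
symbols = match(bits)
freq = {s: symbols.count(s) / len(symbols) for s in TARGET}

# Divergence of the induced dyadic distribution from the target:
dyadic = {'a': 0.5, 'b': 0.25, 'c': 0.25}
div = sum(p * math.log2(p / TARGET[s]) for s, p in dyadic.items())
print(freq, div)
```

The empirical frequencies approach the dyadic distribution, and `div` quantifies the residual mismatch to the target; longer codes shrink this divergence per output symbol.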
Informational Divergence and Entropy Rate on Rooted Trees with Probabilities
Rooted trees with probabilities are used to analyze properties of a variable
length code. A bound is derived on the difference between the entropy rates of
the code and a memoryless source. The bound is in terms of normalized
informational divergence. The bound is used to derive converses for exact
random number generation, resolution coding, and distribution matching. Comment: 5 pages. With proofs and illustrating examples
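As a small illustration of the objects involved (the tree and its leaf probabilities below are hypothetical), the entropy rate of a variable-length code on a rooted tree with probabilities can be computed as leaf entropy per expected leaf depth:

```python
import math

# Leaves of a hypothetical rooted binary tree with probabilities,
# given as (depth, leaf probability); probabilities sum to 1.
LEAVES = [(1, 0.5), (2, 0.3), (2, 0.2)]

def entropy_rate(leaves):
    """Entropy of the leaf distribution per expected codeword length."""
    h = -sum(p * math.log2(p) for _, p in leaves if p > 0)
    mean_len = sum(d * p for d, p in leaves)
    return h / mean_len

print(entropy_rate(LEAVES))  # ≈ 0.99 bits per code symbol
```

For a binary tree this rate never exceeds 1 bit (Jensen's inequality with the Kraft sum), so the gap to 1 reflects how far the leaf distribution is from the dyadic one induced by the depths.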
Greedy Algorithms for Optimal Distribution Approximation
The approximation of a discrete probability distribution P by an
M-type distribution Q (all probabilities integer multiples of 1/M) is
considered. The approximation error is measured by the informational
divergence D(P||Q), which is an appropriate measure, e.g., in the context
of data compression. Properties of the optimal approximation are derived
and bounds on the approximation error are presented, which are
asymptotically tight. It is shown that M-type approximations that minimize
either D(P||Q), or D(Q||P), or the variational distance ||P-Q||_1 can all
be found by using specific instances of the same general greedy
algorithm. Comment: 5 pages
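A hedged sketch of a greedy scheme of this kind (the names M, p, q and the deficit-based scoring rule are illustrative assumptions, not necessarily the paper's exact per-step objective): distribute M mass units of size 1/M, each step giving one unit to the symbol with the largest remaining deficit.

```python
from fractions import Fraction

def greedy_m_type(p, M):
    """Greedy M-type approximation: output probabilities of the form k/M.

    At each of the M steps, one unit of mass 1/M goes to the index with
    the largest remaining deficit p[i] - q[i]; other objectives differ
    only in how this per-step gain is scored.
    """
    k = [0] * len(p)
    for _ in range(M):
        deficits = [pi - Fraction(ki, M) for pi, ki in zip(p, k)]
        k[deficits.index(max(deficits))] += 1
    return [Fraction(ki, M) for ki in k]

p = [Fraction(1, 3), Fraction(1, 3), Fraction(1, 3)]
print(greedy_m_type(p, 4))  # [Fraction(1, 2), Fraction(1, 4), Fraction(1, 4)]
```

Exact rational arithmetic keeps the output a valid distribution: the k_i always sum to M, so the approximation sums to 1 by construction.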