The Sphere Packing Bound For Memoryless Channels
Sphere packing bounds (SPBs), with prefactors that are polynomial in the
block length, are derived for codes on two families of memoryless channels
using Augustin's method: (possibly non-stationary) memoryless channels with
(possibly multiple) additive cost constraints and stationary memoryless
channels with convex constraints on the composition (i.e. empirical
distribution, type) of the input codewords. A variant of Gallager's bound is
derived in order to show that these sphere packing bounds are tight in terms of
the exponential decay rate of the error probability with the block length under
mild hypotheses.
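The exponent these bounds characterize can be evaluated numerically for a simple discrete memoryless channel. The following is a minimal sketch (not the paper's derivation): it computes the classical sphere-packing exponent E_sp(R) = sup_{ρ≥0} [E_0(ρ) − ρR] via Gallager's E_0 function on a grid of ρ values; the BSC channel matrix, the uniform input distribution, and the grid are illustrative assumptions.

```python
import numpy as np

def E0(rho, p, W):
    """Gallager's E_0(rho, p) = -log sum_y (sum_x p(x) W(y|x)^{1/(1+rho)})^{1+rho}.

    p: input distribution over |X|, W: |X| x |Y| channel transition matrix.
    """
    s = 1.0 / (1.0 + rho)
    inner = (p[:, None] * W**s).sum(axis=0)       # sum_x p(x) W(y|x)^{1/(1+rho)}
    return -np.log((inner**(1.0 + rho)).sum())

def sphere_packing_exponent(R, p, W, rhos=np.linspace(0.0, 50.0, 5001)):
    """E_sp(R) = sup_{rho >= 0} [E_0(rho) - rho R], approximated on a grid."""
    return max(E0(rho, p, W) - rho * R for rho in rhos)

# Illustrative example: binary symmetric channel with crossover 0.1,
# uniform input (optimal for the BSC by symmetry). Rates are in nats.
W = np.array([[0.9, 0.1],
              [0.1, 0.9]])
p = np.array([0.5, 0.5])
```

Above capacity (about 0.368 nats for this channel) the supremum is attained at ρ = 0 and the exponent vanishes; below capacity it is strictly positive, which is the regime where the polynomial-prefactor bounds in the abstract apply.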
The Augustin Capacity and Center
For any channel, the existence of a unique Augustin mean is established for
any positive order and probability mass function on the input set. The Augustin
mean is shown to be the unique fixed point of an operator defined in terms of
the order and the input distribution. The Augustin information is shown to be
continuously differentiable in the order. For any channel and convex constraint
set with finite Augustin capacity, the existence of a unique Augustin center
and the associated van Erven-Harremoës bound are established. The
Augustin-Legendre (A-L) information, capacity, center, and radius are
introduced and the latter three are proved to be equal to the corresponding
Rényi-Gallager quantities. The equality of the A-L capacity to the A-L radius
for arbitrary channels and the existence of a unique A-L center for channels
with finite A-L capacity are established. For all interior points of the
feasible set of cost constraints, the cost constrained Augustin capacity and
center are expressed in terms of the A-L capacity and center. Certain shift
invariant families of probabilities and certain Gaussian channels are analyzed
as examples.
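The fixed-point characterization of the Augustin mean suggests a simple numerical procedure for finite channels. The sketch below iterates the tilted-measure operator T(q) = Σ_x p(x) W_x^{q,α}, one standard form of such an operator; the channel, order, and iteration count are illustrative assumptions, not the paper's algorithm, and the iteration is written for orders α in (0, 1].

```python
import numpy as np

def augustin_mean(W, p, alpha, iters=500):
    """Iterate the operator T(q) = sum_x p(x) W_x^{q,alpha}, where W_x^{q,alpha}
    is the tilted measure proportional to W(y|x)^alpha q(y)^(1-alpha).

    W: |X| x |Y| channel matrix, p: input distribution, alpha in (0, 1].
    """
    q = p @ W                                      # start from the output distribution
    for _ in range(iters):
        tilted = W**alpha * q**(1.0 - alpha)       # unnormalized tilted measures, |X| x |Y|
        tilted /= tilted.sum(axis=1, keepdims=True)
        q = p @ tilted                             # apply T
    return q

# Illustrative example: a binary-input binary-output channel, uniform input.
W = np.array([[0.9, 0.1],
              [0.2, 0.8]])
p = np.array([0.5, 0.5])
q_half = augustin_mean(W, p, alpha=0.5)
```

At order α = 1 the tilted measures reduce to the channel rows, so the iteration returns the ordinary output distribution pW, consistent with the Augustin mean recovering the usual mutual-information quantities at that order.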
A derivation of the cost-constrained sphere-packing exponent
We derive the channel-coding sphere-packing exponent under a per-codeword cost constraint. The proof is based on hypothesis testing and holds for continuous memoryless channels.
Divergence Measures
Data science, information theory, probability theory, statistical learning and other related disciplines greatly benefit from non-negative measures of dissimilarity between pairs of probability measures. These are known as divergence measures, and exploring their mathematical foundations and diverse applications is of significant interest. The present Special Issue, entitled “Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems”, includes eight original contributions, and it is focused on the study of the mathematical properties and applications of classical and generalized divergence measures from an information-theoretic perspective. It mainly deals with two key generalizations of the relative entropy: namely, the Rényi divergence and the important class of f-divergences. It is our hope that the readers will find interest in this Special Issue, which will stimulate further research in the study of the mathematical foundations and applications of divergence measures.
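The two generalizations of relative entropy named above can be illustrated concretely. The following is a minimal sketch, with illustrative distributions and natural logarithms throughout: the Rényi divergence of order α and a generic f-divergence, with f(t) = t log t recovering the relative entropy (KL divergence) as a special case.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """D_alpha(P||Q) = (1/(alpha-1)) * log sum_i p_i^alpha q_i^(1-alpha), alpha != 1."""
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

def f_divergence(p, q, f):
    """D_f(P||Q) = sum_i q_i f(p_i / q_i) for a convex f with f(1) = 0."""
    return np.sum(q * f(p / q))

# Illustrative distributions on a three-point alphabet.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

# f(t) = t log t recovers the relative entropy D(P||Q).
kl = f_divergence(p, q, lambda t: t * np.log(t))
```

As α → 1 the Rényi divergence converges to the relative entropy, and it is non-decreasing in α, two of the basic properties studied in the literature this issue surveys.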