Minimum Rates of Approximate Sufficient Statistics
Given a sufficient statistic for a parametric family of distributions, one
can estimate the parameter without access to the data. However, the memory or
code size for storing the sufficient statistic may nonetheless still be
prohibitive. Indeed, for $n$ independent samples drawn from a $k$-nomial
distribution with $d = k-1$ degrees of freedom, the length of the code scales as
$d \log n + O(1)$. In many applications, we may not have a useful notion of
sufficient statistics (e.g., when the parametric family is not an exponential
sufficient statistics (e.g., when the parametric family is not an exponential
family) and we also may not need to reconstruct the generating distribution
exactly. By adopting a Shannon-theoretic approach in which we allow a small
error in estimating the generating distribution, we construct various {\em
approximate sufficient statistics} and show that the code length can be reduced
to $\frac{d}{2} \log n + O(1)$. We consider errors measured according to the
relative entropy and variational distance criteria. For the code constructions,
we leverage Rissanen's minimum description length principle, which yields a
non-vanishing error measured according to the relative entropy. For the
converse parts, we use Clarke and Barron's formula for the relative entropy of
a parametrized distribution and the corresponding mixture distribution.
However, this method only yields a weak converse for the variational distance.
We develop new techniques to achieve vanishing errors and we also prove strong
converses. The latter means that even if the code is allowed to have a
non-vanishing error, its length must still be at least $\frac{d}{2} \log n$.

Comment: To appear in the IEEE Transactions on Information Theory
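For orientation, the converse tool named above is Clarke and Barron's asymptotic expansion of the relative entropy between an i.i.d. parametric source and its Bayes mixture. The following is the standard form from the literature, in our notation ($I(\theta)$ the Fisher information, $w$ a smooth prior), stated here only as a sketch of why the $\frac{d}{2} \log n$ rate appears:
\[
  D\bigl(P_\theta^n \,\big\|\, M_n\bigr)
  = \frac{d}{2}\log\frac{n}{2\pi e}
  + \log\frac{\sqrt{\det I(\theta)}}{w(\theta)} + o(1),
  \qquad
  M_n(x^n) = \int w(\theta')\,P_{\theta'}^n(x^n)\,\mathrm{d}\theta'.
\]
The $\frac{d}{2}\log n$ growth of this divergence is what forces any scheme that describes the generating distribution to within small relative entropy to spend at least $\frac{d}{2}\log n$ bits.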
Second-Order Coding Rates for Channels with State
We study the performance limits of state-dependent discrete memoryless
channels with a discrete state available at both the encoder and the decoder.
We establish the $\epsilon$-capacity as well as necessary and sufficient
conditions for the strong converse property for such channels when the sequence
of channel states is not necessarily stationary, memoryless or ergodic. We then
seek a finer characterization of these capacities in terms of second-order
coding rates. The general results are supplemented by several examples
including i.i.d. and Markov states and mixed channels.
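As a rough illustration of what a second-order coding rate captures, in the simplest memoryless settings the maximum code size $M^*(n,\epsilon)$ at blocklength $n$ and error probability $\epsilon$ obeys a normal approximation of the following shape; the symbols $C$ (capacity), $V$ (dispersion), and $\Phi^{-1}$ (inverse Gaussian cdf) are standard notation assumed here, not taken from the abstract:
\[
  \log M^*(n,\epsilon) = nC + \sqrt{nV}\,\Phi^{-1}(\epsilon) + o(\sqrt{n}).
\]
For the non-stationary, non-ergodic state sequences treated in the paper, the coefficient of the $\sqrt{n}$ term can take a more general form than this single-Gaussian expression.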
The Third-Order Term in the Normal Approximation for the AWGN Channel
This paper shows that, under the average error probability formalism, the
third-order term in the normal approximation for the additive white Gaussian
noise channel with a maximal or equal power constraint is at least
$\frac{1}{2} \log n + O(1)$. This matches the upper bound derived by
Polyanskiy-Poor-Verd\'{u} (2010).

Comment: 13 pages, 1 figure
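To place the result, the claimed lower bound pins down the third term in the normal approximation for the AWGN channel. Writing it out with the standard quantities $C(P) = \frac{1}{2}\log(1+P)$ and the Gaussian dispersion $V(P)$ (our notation, not the paper's text), the expansion reads:
\[
  \log M^*(n,\epsilon)
  = nC(P) + \sqrt{nV(P)}\,\Phi^{-1}(\epsilon) + \frac{1}{2}\log n + O(1),
  \qquad
  V(P) = \frac{P(P+2)}{2(P+1)^2}\log^2 e.
\]
The paper supplies the matching $+\frac{1}{2}\log n$ achievability under the average error formalism, complementing the Polyanskiy-Poor-Verd\'{u} upper bound.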
