Single-Letter Characterization of Epsilon-Capacity for Mixed Memoryless Channels
For the class of mixed channels decomposed into stationary memoryless
channels, single-letter characterizations of the $\varepsilon$-capacity have
not been known except for restricted classes of channels such as the regular
decomposable channel introduced by Winkelbauer. This paper gives single-letter
characterizations of the $\varepsilon$-capacity for mixed channels decomposed into
at most countably many memoryless channels with a finite input alphabet and a
general output alphabet with/without cost constraints. It is shown that a given
characterization reduces to the one for the channel capacity given by Ahlswede
when $\varepsilon$ is zero. In the proof of the coding theorem, the meta-converse
bound, originally given by Polyanskiy, Poor and Verd\'u, is
particularized for the mixed channel decomposed into general component
channels.
Comment: This is an extended version of the paper submitted to the 2014 IEEE
International Symposium on Information Theory (ISIT2014).
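For orientation, here is a hedged LaTeX sketch of the two standard objects the abstract invokes: the operational definition of the $\varepsilon$-capacity, from which the reduction at $\varepsilon = 0$ can be read off, and the usual hypothesis-testing form of the meta-converse bound. These are textbook statements supplied here for background, not the paper's own single-letter characterization.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Operational definition of the epsilon-capacity: the supremum of rates
% achievable with asymptotic average error probability at most epsilon.
\[
  C_\varepsilon = \sup\Bigl\{ R :\ \exists\,(n, M_n, \varepsilon_n)\ \text{codes with}\
  \liminf_{n\to\infty} \tfrac{1}{n}\log M_n \ge R,\
  \limsup_{n\to\infty} \varepsilon_n \le \varepsilon \Bigr\}.
\]
% At epsilon = 0 this recovers the ordinary channel capacity, which is
% where the reduction to Ahlswede's formula enters.
%
% Meta-converse (Polyanskiy-Poor-Verdu), hypothesis-testing form: for any
% (n, M, epsilon) code inducing input distribution P_X, and any auxiliary
% output distribution Q_Y,
\[
  \beta_{1-\varepsilon}\bigl(P_X P_{Y|X},\ P_X \times Q_Y\bigr) \,\le\, \frac{1}{M},
\]
% where beta_a(P, Q) is the minimum type-II error over tests whose type-I
% error under P is at most 1 - a; choosing Q_Y as a mixture over the
% component channels is the particularization step the abstract mentions.
\end{document}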
Universal channel coding for general output alphabet
We propose two types of universal codes that are suited to two asymptotic
regimes when the output alphabet is possibly continuous. The first class has
the property that the error probability decays exponentially fast and we
identify an explicit lower bound on the error exponent. The other class attains
the epsilon-capacity of the channel and we also identify the second-order term in
the asymptotic expansion. The proposed encoder is essentially based on the
packing lemma of the method of types. For the decoder, we first derive a
R\'enyi-relative-entropy version of Clarke and Barron's formula for the distance
between the true distribution and the Bayesian mixture, which is of independent
interest. The universal decoder is stated in terms of this formula and
quantities used in the information spectrum method. The methods contained
herein allow us to analyze universal codes for channels with continuous and
discrete output alphabets in a unified manner, and to analyze their
performances in terms of the exponential decay of the error probability and the
second-order coding rate.
Comment: Several typos are fixed.
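For context, the paper's R\'enyi-relative-entropy version generalizes the classical Kullback-Leibler form of Clarke and Barron's formula, which can be sketched as follows; the notation (a smooth $d$-dimensional family, prior $w$, Fisher information $J$) is standard background supplied here, not taken from the paper.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Clarke-Barron (1990), KL form: for a smooth d-dimensional parametric
% family {P_theta} with prior w and Bayesian mixture M_n, as n grows,
\[
  D\bigl(P_\theta^{\,n}\,\big\|\,M_n\bigr)
  = \frac{d}{2}\log\frac{n}{2\pi e}
  + \frac{1}{2}\log\det J(\theta)
  + \log\frac{1}{w(\theta)} + o(1),
  \qquad
  M_n = \int w(\theta')\,P_{\theta'}^{\,n}\,d\theta'.
\]
% J(theta) is the Fisher information matrix. The mixture tracks the true
% n-fold distribution to within (d/2) log n nats, which is what makes a
% Bayesian mixture a workable universal decoding metric when the channel
% parameter theta is unknown.
\end{document}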
Shannon meets von Neumann: A Minimax Theorem for Channel Coding in the Presence of a Jammer
We study the setting of channel coding over a family of channels whose state
is controlled by an adversarial jammer by viewing it as a zero-sum game between
a finite-blocklength encoder-decoder team and the jammer. The encoder-decoder
team chooses stochastic encoding and decoding strategies to minimize the average
probability of error in transmission, while the jammer chooses a distribution
on the state-space to maximize this probability. The min-max value of this game
is equivalent to channel coding for a compound channel -- we call this the
Shannon solution of the problem. The max-min value corresponds to finding a
mixed channel with the largest value of the minimum achievable probability of
error. When the min-max and max-min values are equal, the problem is said to
admit a saddle-point or von Neumann solution. While a Shannon solution always
exists, a von Neumann solution need not, owing to inherent nonconvexity in the
communicating team's problem. Despite this, we show that the min-max and
max-min values become equal asymptotically in the large blocklength limit, for
all but finitely many rates. We explicitly characterize this limiting value as
a function of the rate and obtain tight finite-blocklength bounds on the
min-max and max-min values. As a corollary, we get an explicit expression for the
$\varepsilon$-capacity of a compound channel under stochastic codes -- the first
such result, to the best of our knowledge. Our results demonstrate a deeper
relation between the compound channel and mixed channel than was previously
known. They also show that the conventional information-theoretic viewpoint,
articulated via the Shannon solution, coincides asymptotically with the
game-theoretic one articulated via the von Neumann solution.
Comment: Submitted to the IEEE Transactions on Information Theory.
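As a toy finite-game analogue (an illustration of ours, not the paper's construction), the Python sketch below shows the convexity issue behind the saddle-point question: a zero-sum game can have a strict gap between its pure-strategy max-min and min-max values, while allowing mixed (stochastic) strategies closes the gap, which is the von Neumann-style resolution the abstract says the finite-blocklength coding game may lack.

import numpy as np

# Matching pennies as a zero-sum game: the row player (the encoder-decoder
# "team" analogue) picks row i to minimize the cost A[i, j]; the column
# player (the "jammer" analogue) picks column j to maximize it.
A = np.array([[1.0, -1.0],
              [-1.0, 1.0]])

# Pure-strategy values. Weak duality always gives max-min <= min-max.
max_min = A.min(axis=0).max()   # jammer commits first: -1.0
min_max = A.max(axis=1).min()   # team commits first:   +1.0
print(f"pure max-min = {max_min}, pure min-max = {min_max}")

# With mixed strategies the gap closes (von Neumann's minimax theorem):
# uniform randomization by both sides attains the game's value, 0.
x = y = np.array([0.5, 0.5])
print(f"mixed-strategy value = {x @ A @ y}")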