Channel Detection in Coded Communication
We consider the problem of block-coded communication, where in each block,
the channel law belongs to one of two disjoint sets. The decoder aims to
decode only messages that have passed through a channel from one of the sets, and
thus has to detect the set which contains the prevailing channel. We begin with
the simplified case where each of the sets is a singleton. For any given code,
we derive the optimum detection/decoding rule in the sense of the best
trade-off among the probabilities of decoding error, false alarm, and
misdetection, and also introduce sub-optimal detection/decoding rules which are
simpler to implement. Then, various achievable bounds on the error exponents
are derived, including the exact single-letter characterization of the random
coding exponents for the optimal detector/decoder. We then extend the random
coding analysis to general sets of channels, and show that there exists a
universal detector/decoder which performs asymptotically as well as the optimal
detector/decoder, when tuned to detect a channel from a specific pair of
channels. The case of a pair of binary symmetric channels is discussed in
detail.
Comment: Submitted to IEEE Transactions on Information Theory.
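A minimal numerical sketch of the singleton case discussed above (not from the paper; the crossover probabilities and the reduction to a flip-count threshold are illustrative assumptions): for a pair of binary symmetric channels observed over known inputs, the likelihood-ratio test reduces to a threshold on the number of flips, and by Stein's lemma the misdetection exponent at a fixed false-alarm level is the binary KL divergence between the crossover probabilities.

```python
import math

def bin_kl(a, b):
    """Binary KL divergence D(a || b) in nats."""
    def term(x, y):
        return 0.0 if x == 0 else x * math.log(x / y)
    return term(a, b) + term(1 - a, 1 - b)

# Hypothetical crossover probabilities for the two BSCs.
p0, p1 = 0.05, 0.20

# With n channel uses over known inputs, the flip count is Binomial(n, p),
# so the likelihood-ratio test is a threshold on that count. Fixing the
# false-alarm probability, the misdetection probability decays with
# exponent D(p0 || p1) (Stein's lemma).
print(bin_kl(p0, p1))   # misdetection exponent, nats per channel use
print(bin_kl(p1, p0))   # exponent with the two hypotheses swapped
```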
Asynchronous Communication: Exact Synchronization, Universality, and Dispersion
Recently, Tchamkerten and coworkers proposed a novel variation of the problem of joint synchronization and error correction. This paper considers a strengthened formulation that requires the decoder to estimate both the message and the location of the codeword exactly. Such a scheme allows for transmitting data bits during the synchronization phase of the communication, thereby improving bandwidth and energy efficiency. It is shown that the capacity region remains unchanged under the exact synchronization requirement. Furthermore, asynchronous capacity can be achieved by universal (channel-independent) codes. Comparisons with earlier results on another (delay-compensated) definition of rate are made. The finite-blocklength regime is investigated, and it is demonstrated that even for moderate blocklengths it is possible to construct capacity-achieving codes that tolerate an exponential level of asynchronism and experience only a rather small loss in rate compared to the perfectly synchronized setting; in particular, the channel dispersion does not suffer any degradation due to asynchronism. For the binary symmetric channel, a translation (coset) of a good linear code is shown to achieve the capacity-synchronization tradeoff.
National Science Foundation (U.S.) (Center for Science of Information, Grant CCF-0939370)
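As a concrete illustration of the dispersion claim above (the parameter values are hypothetical, not taken from the paper): for a binary symmetric channel, the normal approximation R ≈ C − sqrt(V/n)·Q⁻¹(ε) quantifies the finite-blocklength rate loss, and the statement that asynchronism does not degrade dispersion means the same V governs both settings.

```python
import math
from statistics import NormalDist

def bsc_capacity(p):
    """Capacity of a BSC(p) in bits per channel use."""
    h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return 1 - h

def bsc_dispersion(p):
    """Channel dispersion of a BSC(p) in bits^2 per channel use."""
    return p * (1 - p) * (math.log2((1 - p) / p)) ** 2

def normal_approx_rate(n, p, eps):
    """Normal approximation R ~ C - sqrt(V/n) * Qinv(eps), bits/use."""
    qinv = NormalDist().inv_cdf(1 - eps)  # Q^{-1}(eps)
    return bsc_capacity(p) - math.sqrt(bsc_dispersion(p) / n) * qinv

# Hypothetical operating point: crossover 0.11, block error probability 1e-3.
p, eps = 0.11, 1e-3
for n in (500, 2000, 8000):
    print(n, round(normal_approx_rate(n, p, eps), 4))
```

The gap to capacity shrinks like 1/sqrt(n), so the moderate-blocklength rate loss the abstract mentions is visible directly from the dispersion term.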
On the Reliability Function of Distributed Hypothesis Testing Under Optimal Detection
The distributed hypothesis testing problem with full side-information is
studied. The trade-off (reliability function) between the two types of error
exponents under a limited communication rate is characterized in two steps. First, the
problem is reduced to the problem of determining the reliability function of
channel codes designed for detection (in analogy to a similar result which
connects the reliability function of distributed lossless compression and
ordinary channel codes). Second, a single-letter random-coding bound based on a
hierarchical ensemble, as well as a single-letter expurgated bound, are derived
for the reliability of channel-detection codes. Both bounds are derived for a
system which employs the optimal detection rule. We conjecture that the
resulting random-coding bound is ensemble-tight, and consequently optimal
within the class of quantization-and-binning schemes.
Nonasymptotic noisy lossy source coding
This paper shows new general nonasymptotic achievability and converse bounds
and performs their dispersion analysis for the lossy compression problem in
which the compressor observes the source through a noisy channel. While this
problem is asymptotically equivalent to a noiseless lossy source coding problem
with a modified distortion function, nonasymptotically there is a noticeable
gap in how fast their minimum achievable coding rates approach the common
rate-distortion function, as evidenced both by the refined asymptotic analysis
(dispersion) and the numerical results. The size of the gap between the
dispersions of the noisy problem and the asymptotically equivalent noiseless
problem depends on the stochastic variability of the channel through which the
compressor observes the source.
Comment: IEEE Transactions on Information Theory, 201
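The asymptotic equivalence mentioned above uses the standard surrogate distortion d′(z, y) = E[d(X, y) | Z = z], where Z is the noisy observation of the source X. A minimal sketch (the binary source, BSC observation channel, and Hamming distortion are illustrative assumptions, not the paper's setting):

```python
def modified_distortion(delta):
    """Surrogate distortion d'(z, y) = E[ d(X, y) | Z = z ] for an
    equiprobable binary source X observed through a BSC(delta),
    with Hamming distortion d(x, y) = 1{x != y}."""
    d = {}
    for z in (0, 1):
        # Uniform prior: P(X = z | Z = z) = 1 - delta, P(X != z | Z = z) = delta.
        for y in (0, 1):
            d[(z, y)] = delta if y == z else 1 - delta  # P(X != y | Z = z)
    return d

print(modified_distortion(0.1))
```

Nonasymptotically, as the abstract notes, compressing Z under d′ and compressing a clean source are not equivalent: the channel's stochastic variability enters the dispersion.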
Random Access Channel Coding in the Finite Blocklength Regime
Consider a random access communication scenario over a channel whose
operation is defined for any number of possible transmitters. Inspired by the
model recently introduced by Polyanskiy for the Multiple Access Channel (MAC)
with a fixed, known number of transmitters, we assume that the channel is
invariant to permutations on its inputs, and that all active transmitters
employ identical encoders. Unlike Polyanskiy, we consider a scenario where
neither the transmitters nor the receiver know which transmitters are active.
We refer to this agnostic communication setup as the Random Access Channel, or
RAC. Scheduled feedback of a finite number of bits is used to synchronize the
transmitters. The decoder is tasked with determining from the channel output
the number of active transmitters and their messages, but not which
transmitter sent which message. The decoding procedure occurs at a time that
depends on the decoder's estimate of the number of active transmitters,
thereby achieving a rate that varies with that number. Single-bit feedback at
each potential decoding time enables all
transmitters to determine the end of one coding epoch and the start of the
next. The central result of this work demonstrates the achievability on a RAC
of performance that is first-order optimal for the MAC in operation during each
coding epoch. While prior multiple access schemes for a fixed number of
transmitters require multiple simultaneous threshold rules, the proposed
scheme uses a single threshold rule and achieves the same dispersion.
Comment: Presented at ISIT'18; submitted to IEEE Transactions on Information Theory.
Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities
This monograph presents a unified treatment of single- and multi-user
problems in Shannon's information theory where we depart from the requirement
that the error probability decays asymptotically in the blocklength. Instead,
the error probabilities for various problems are bounded above by a
non-vanishing constant and the spotlight is shone on achievable coding rates as
functions of the growing blocklengths. This represents the study of asymptotic
estimates with non-vanishing error probabilities.
In Part I, after reviewing the fundamentals of information theory, we discuss
Strassen's seminal result for binary hypothesis testing where the type-I error
probability is non-vanishing and the rate of decay of the type-II error
probability with growing number of independent observations is characterized.
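Strassen's result admits a compact numerical sketch (the distributions below are illustrative, not from the monograph): at type-I error level ε, the minimum type-II error β satisfies log β ≈ −nD(P‖Q) + sqrt(nV(P‖Q))·Φ⁻¹(ε), where V is the relative-entropy variance.

```python
import math
from statistics import NormalDist

def kl_and_var(P, Q):
    """Relative entropy D(P||Q) and relative-entropy variance V(P||Q), nats."""
    logr = [math.log(p / q) for p, q in zip(P, Q)]
    D = sum(p * l for p, l in zip(P, logr))
    V = sum(p * (l - D) ** 2 for p, l in zip(P, logr))
    return D, V

def strassen_log_beta(n, P, Q, eps):
    """Second-order (Strassen-style) approximation to the log of the
    minimum type-II error at type-I level eps, from n i.i.d. samples."""
    D, V = kl_and_var(P, Q)
    return -n * D + math.sqrt(n * V) * NormalDist().inv_cdf(eps)

# Hypothetical pair of hypotheses on a binary alphabet.
P, Q = [0.5, 0.5], [0.8, 0.2]
print(strassen_log_beta(1000, P, Q, 0.01))
```

The same dominant term nD with a sqrt(n) backoff is the template for the second-order expansions developed in Part II.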
In Part II, we use this basic hypothesis-testing result to develop second-
and sometimes even third-order asymptotic expansions for point-to-point
communication. Finally in Part III, we consider network information theory
problems for which the second-order asymptotics are known. These problems
include some classes of channels with random state, the multiple-encoder
distributed lossless source coding (Slepian-Wolf) problem and special cases of
the Gaussian interference and multiple-access channels. We conclude by
discussing avenues for further research.
Comment: Further comments welcome.