Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities
This monograph presents a unified treatment of single- and multi-user
problems in Shannon's information theory where we depart from the requirement
that the error probability decays asymptotically in the blocklength. Instead,
the error probabilities for various problems are bounded above by a
non-vanishing constant and the spotlight is shone on achievable coding rates as
functions of the growing blocklengths. This represents the study of asymptotic
estimates with non-vanishing error probabilities.
In Part I, after reviewing the fundamentals of information theory, we discuss
Strassen's seminal result for binary hypothesis testing where the type-I error
probability is non-vanishing and the rate of decay of the type-II error
probability with growing number of independent observations is characterized.
In Part II, we use this basic hypothesis testing result to develop second- and,
in some cases, third-order asymptotic expansions for point-to-point
communication. In Part III, we consider network information theory
problems for which the second-order asymptotics are known. These problems
include some classes of channels with random state, the multiple-encoder
distributed lossless source coding (Slepian-Wolf) problem and special cases of
the Gaussian interference and multiple-access channels. Finally, we discuss
avenues for further research.
Comment: Further comments welcome.
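The flavor of the second-order expansions developed in Part II can be illustrated numerically with the normal approximation for a binary symmetric channel. This is a minimal sketch, not code from the monograph; the function names and the BSC(0.11), eps = 1e-3 example are my own choices.

```python
import math
from statistics import NormalDist

def h2(p):
    """Binary entropy in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def normal_approx_rate(n, eps, p):
    """Second-order (normal) approximation to the maximal coding rate of a
    BSC with crossover probability p, at blocklength n and non-vanishing
    error probability eps:  R ~ C - sqrt(V/n) * Q^{-1}(eps)."""
    C = 1 - h2(p)                                    # capacity, bits/use
    V = p * (1 - p) * math.log2((1 - p) / p) ** 2    # channel dispersion
    q_inv = NormalDist().inv_cdf(1 - eps)            # Q^{-1}(eps)
    return C - math.sqrt(V / n) * q_inv

# Approximate rate approaches capacity as the blocklength grows.
for n in (100, 1000, 10000):
    print(n, round(normal_approx_rate(n, 1e-3, 0.11), 4))
```

The backoff from capacity decays like 1/sqrt(n), which is the defining feature of this fixed-error asymptotic regime.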
Controlled Sensing for Multihypothesis Testing
The problem of multiple hypothesis testing with observation control is
considered in both fixed sample size and sequential settings. In the fixed
sample size setting, for binary hypothesis testing, the optimal exponent for
the maximal error probability corresponds to the maximum Chernoff information
over the choice of controls, and a pure stationary open-loop control policy is
asymptotically optimal within the larger class of all causal control policies.
For multihypothesis testing in the fixed sample size setting, lower and upper
bounds on the optimal error exponent are derived. It is also shown through an
example with three hypotheses that the optimal causal control policy can be
strictly better than the optimal open-loop control policy. In the sequential
setting, a test based on earlier work by Chernoff for binary hypothesis
testing is shown to be first-order asymptotically optimal for multihypothesis
testing in a strong sense, using the notion of decision making risk in place of
the overall probability of error. Another test is also designed to meet hard
risk constraints while retaining asymptotic optimality. The role of past
information and randomization in designing optimal control policies is
discussed.
Comment: To appear in the Transactions on Automatic Control.
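The Chernoff information that governs the fixed-sample-size binary case, and its maximization over controls, can be sketched with a simple grid search. The function names, the grid-search approximation, and the two control-induced distribution pairs below are my illustration, not the paper's.

```python
import math

def chernoff_information(P, Q, grid=1000):
    """Chernoff information between two pmfs on a common alphabet:
       C(P, Q) = max_{0 <= s <= 1} -log sum_x P(x)^(1-s) Q(x)^s,
    approximated here by a grid search over s."""
    best = 0.0
    for i in range(grid + 1):
        s = i / grid
        z = sum(p ** (1 - s) * q ** s for p, q in zip(P, Q))
        best = max(best, -math.log(z))
    return best

# Each control u induces a pair (P_u, Q_u) of observation distributions;
# the best fixed (open-loop) control maximizes the Chernoff information.
pairs = {"u1": ([0.5, 0.5], [0.9, 0.1]),
         "u2": ([0.3, 0.7], [0.6, 0.4])}
best_u = max(pairs, key=lambda u: chernoff_information(*pairs[u]))
print(best_u, chernoff_information(*pairs[best_u]))
```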
On the Reliability Function of Distributed Hypothesis Testing Under Optimal Detection
The distributed hypothesis testing problem with full side-information is
studied. The trade-off (reliability function) between the two types of error
exponents under limited rate is studied in the following way. First, the
problem is reduced to the problem of determining the reliability function of
channel codes designed for detection (in analogy to a similar result which
connects the reliability function of distributed lossless compression and
ordinary channel codes). Second, a single-letter random-coding bound based on a
hierarchical ensemble, as well as a single-letter expurgated bound, are derived
for the reliability of channel-detection codes. Both bounds are derived for a
system which employs the optimal detection rule. We conjecture that the
resulting random-coding bound is ensemble-tight, and consequently optimal
within the class of quantization-and-binning schemes.
Theoretical Bounds in Minimax Decentralized Hypothesis Testing
Minimax decentralized detection is studied under two scenarios: with and
without a fusion center when the source of uncertainty is the Bayesian prior.
When there is no fusion center, the constraints in the network design are
determined. Both for a single decision maker and multiple decision makers, the
maximum loss in detection performance due to minimax decision making is
obtained. In the presence of a fusion center, the maximum loss of detection
performance between networks with and without a fusion center is derived,
assuming that both networks are minimax robust. The results are finally
generalized.
Comment: Submitted to IEEE Trans. on Signal Processing.
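For a single decision maker, minimax robustness against an uncertain Bayesian prior amounts to designing for the least favorable prior, the one maximizing the Bayes risk. A hedged sketch, assuming two discrete observation distributions of my choosing (the pmfs, names, and grid search are not from the paper):

```python
def bayes_risk(prior, P0, P1):
    """Minimum Bayes error probability for Pr(H0) = prior:
       R(pi) = sum_x min(pi * P0(x), (1 - pi) * P1(x))."""
    return sum(min(prior * p0, (1 - prior) * p1) for p0, p1 in zip(P0, P1))

def least_favorable_prior(P0, P1, grid=1000):
    """Grid search for the prior maximizing the Bayes risk; the Bayes rule
    at this prior is the minimax rule when the maximizer is interior."""
    return max((i / grid for i in range(grid + 1)),
               key=lambda pi: bayes_risk(pi, P0, P1))

P0 = [0.6, 0.3, 0.1]
P1 = [0.1, 0.3, 0.6]
pi_star = least_favorable_prior(P0, P1)
print(pi_star, bayes_risk(pi_star, P0, P1))
```

For this symmetric pair the least favorable prior is the uniform one, and the gap between the minimax risk and the Bayes risk at any other prior quantifies the loss due to minimax decision making.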
Channel Detection in Coded Communication
We consider the problem of block-coded communication, where in each block,
the channel law belongs to one of two disjoint sets. The decoder is aimed to
decode only messages that have undergone a channel from one of the sets, and
thus has to detect the set which contains the prevailing channel. We begin with
the simplified case where each of the sets is a singleton. For any given code,
we derive the optimum detection/decoding rule in the sense of the best
trade-off among the probabilities of decoding error, false alarm, and
misdetection, and also introduce sub-optimal detection/decoding rules which are
simpler to implement. Then, various achievable bounds on the error exponents
are derived, including the exact single-letter characterization of the random
coding exponents for the optimal detector/decoder. We then extend the random
coding analysis to general sets of channels, and show that there exists a
universal detector/decoder which performs asymptotically as well as the optimal
detector/decoder, when tuned to detect a channel from a specific pair of
channels. The case of a pair of binary symmetric channels is discussed in
detail.
Comment: Submitted to IEEE Transactions on Information Theory.
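In the singleton-set case with a pair of BSCs, the likelihood ratio between the two channel laws, given a candidate codeword, depends on the received word only through their Hamming distance. The toy detector/decoder below is my sketch of that structure (a sub-optimal "decode first, then threshold" rule, not the paper's optimal joint rule; all names and parameters are mine):

```python
import math

def bsc_llr(d, n, p0, p1):
    """Log-likelihood ratio log P_{p0}(y|x) / P_{p1}(y|x) for two BSCs,
    given Hamming distance d = d_H(x, y) at blocklength n."""
    return d * math.log(p0 / p1) + (n - d) * math.log((1 - p0) / (1 - p1))

def detect_and_decode(y, codebook, p0, p1, threshold=0.0):
    """Decode to the nearest codeword, then threshold the LLR at the
    decoded distance to decide which of the two BSCs was in effect."""
    n = len(y)
    x_hat = min(codebook, key=lambda x: sum(a != b for a, b in zip(x, y)))
    d = sum(a != b for a, b in zip(x_hat, y))
    channel = 0 if bsc_llr(d, n, p0, p1) >= threshold else 1
    return x_hat, channel

codebook = [(0,) * 8, (1,) * 8, (0, 0, 0, 0, 1, 1, 1, 1)]
y = (0, 0, 0, 1, 0, 0, 0, 0)    # all-zero codeword with one flip
print(detect_and_decode(y, codebook, p0=0.05, p1=0.3))
```

Shifting `threshold` trades false alarm against misdetection, the trade-off whose exponents the paper characterizes.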