Second-Order Asymptotics for the Discrete Memoryless MAC with Degraded Message Sets
This paper studies the second-order asymptotics of the discrete memoryless multiple-access channel with degraded message sets. For a fixed average error probability epsilon and an arbitrary point on the boundary of the capacity region, we characterize the speed of convergence of rate pairs converging to that point for codes whose asymptotic error probability is no larger than epsilon, thus complementing an analogous result given previously for the Gaussian setting.
Comment: 5 pages, 1 figure. Follow-up paper of http://arxiv.org/abs/1310.1197. Accepted to ISIT 201
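For intuition, in the single-user (point-to-point) setting such second-order results take the form of the well-known normal approximation below, where C is the channel capacity, V the channel dispersion, and Q^{-1} the inverse Gaussian tail function; this is only an illustrative single-user analogue of the rate-pair convergence characterized in the paper, not its multiple-access result:
\[
\log M^*(n, \epsilon) = n C - \sqrt{n V}\, Q^{-1}(\epsilon) + O(\log n).
\]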
First- and Second-Order Hypothesis Testing for Mixed Memoryless Sources with General Mixture
The first- and second-order optimum achievable exponents in the simple hypothesis testing problem are investigated. The optimum achievable exponent for the type II error probability, under the constraint that the type I error probability is asymptotically at most epsilon, is called the epsilon-optimum exponent. In this paper, we first give the second-order epsilon-optimum exponent in the case where the null hypothesis is a mixed memoryless source and the alternative hypothesis is a stationary memoryless source. We then generalize this setting to the case where the alternative hypothesis is also a mixed memoryless source, and address the first-order epsilon-optimum exponent in this setting. In addition, we discuss an extension of our results to more general settings, such as hypothesis testing with mixed general sources, and the relationship with the general compound hypothesis testing problem.
Comment: 23 pages
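As a rough sketch of the quantity being studied (the notation below loosely follows the information-spectrum, general-source framework and is an assumption of this sketch, not taken from the paper), the first-order epsilon-optimum exponent can be written as
\[
B(\epsilon \mid \mathbf{X} \,\|\, \overline{\mathbf{X}})
= \sup\Big\{ E : \exists \text{ tests with } \limsup_{n\to\infty} \alpha_n \le \epsilon
\ \text{and}\ \liminf_{n\to\infty} \tfrac{1}{n}\log\tfrac{1}{\beta_n} \ge E \Big\},
\]
where alpha_n and beta_n denote the type I and type II error probabilities. For two stationary memoryless hypotheses P and Q, Stein's lemma together with its strong converse gives B(epsilon) = D(P||Q) for every epsilon in (0,1), whereas mixed sources generally yield epsilon-dependent exponents.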
Sharp Second-Order Pointwise Asymptotics for Lossless Compression with Side Information
The problem of determining the best achievable performance of arbitrary lossless compression algorithms is examined when correlated side information is available at both the encoder and decoder. For arbitrary source-side information pairs, the conditional information density is shown to provide a sharp asymptotic lower bound for the description lengths achieved by an arbitrary sequence of compressors. This implies that for ergodic source-side information pairs, the conditional entropy rate is the best achievable asymptotic lower bound to the rate, not just in expectation but with probability one. Under appropriate mixing conditions, a central limit theorem and a law of the iterated logarithm are proved, describing the inevitable fluctuations of the second-order asymptotically best possible rate. An idealised version of Lempel-Ziv coding with side information is shown to be universally first- and second-order asymptotically optimal, under the same conditions. These results are in part based on a new almost-sure invariance principle for the conditional information density, which may be of independent interest.
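Schematically, and under mixing conditions of the kind referred to above, such a central limit theorem for the conditional information density takes the following form, where H(X|Y) is the conditional entropy rate and sigma^2 >= 0 is a limiting (conditional varentropy-type) variance; this is a hedged sketch of the shape of the result, not a verbatim statement from the paper:
\[
\frac{-\log P(X_1^n \mid Y_1^n) - n H(X \mid Y)}{\sqrt{n}} \;\xrightarrow{d}\; N(0, \sigma^2).
\]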