Random Coding Error Exponents for the Two-User Interference Channel
This paper derives lower bounds on the error exponents of the two-user
interference channel under random coding, for several
ensembles. Specifically, we first analyze the standard random coding ensemble,
where the codebooks consist of independently and identically distributed
(i.i.d.) codewords. For this ensemble, we focus on optimum decoding, which is
in contrast to other, suboptimal decoding rules that have been used in the
literature (e.g., joint typicality decoding, treating interference as noise,
etc.). The fact that the interfering signal is a codeword, rather than an
i.i.d. noise process, complicates the application of conventional techniques of
performance analysis of the optimum decoder. Moreover, these conventional
techniques yield loose bounds. Using analytical tools rooted
in statistical physics, as well as advanced union bounds, we derive
single-letter formulas for the random coding error exponents. We compare our
results with the best known lower bound on the error exponent, and show that
our exponents can be strictly better. Then, in the second part of this paper,
we consider more complicated coding ensembles, and find a lower bound on the
error exponent associated with the celebrated Han-Kobayashi (HK) random coding
ensemble, which is based on superposition coding.
Comment: accepted to IEEE Transactions on Information Theory
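The classical single-user baseline behind such results is Gallager's random coding exponent. As a toy illustration only (a binary symmetric channel with a uniform input, not the two-user interference analysis of the paper), E_r(R) = max over rho in [0,1] of E_0(rho) - rho*R can be evaluated numerically:

```python
import numpy as np

def gallager_E0(rho, Q, W):
    # E_0(rho) = -log2 sum_y [ sum_x Q(x) W(y|x)^{1/(1+rho)} ]^{1+rho}
    inner = (Q[:, None] * W ** (1.0 / (1.0 + rho))).sum(axis=0)
    return -np.log2((inner ** (1.0 + rho)).sum())

def random_coding_exponent(R, Q, W, grid=1000):
    # E_r(R) = max_{0 <= rho <= 1} E_0(rho) - rho * R
    return max(gallager_E0(rho, Q, W) - rho * R
               for rho in np.linspace(0.0, 1.0, grid))

# Toy channel: BSC with crossover 0.1, uniform input
p = 0.1
W = np.array([[1 - p, p], [p, 1 - p]])   # W[x, y] = P(y | x)
Q = np.array([0.5, 0.5])
C = 1 + p * np.log2(p) + (1 - p) * np.log2(1 - p)  # capacity, about 0.531 bits

print(random_coding_exponent(0.25, Q, W))  # strictly positive below capacity
print(random_coding_exponent(C, Q, W))     # vanishes at capacity
```

The exponent is positive for rates below capacity and degrades to zero as R approaches C, which is what makes strict improvements of such exponents (as claimed in the abstract) meaningful.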
Error Performance of Channel Coding in Random Access Communication
A new channel coding approach was proposed in [1] for random multiple access
communication over the discrete-time memoryless channel. The coding approach
allows users to choose their communication rates independently without sharing
the rate information among each other or with the receiver. The receiver will
either decode the message or report a collision depending on whether reliable
message recovery is possible. It was shown that, asymptotically as the codeword
length goes to infinity, the set of communication rates supporting reliable
message recovery can be characterized by an achievable region that equals
Shannon's information rate region, possibly without the convex hull operation. In
this paper, we derive achievable bounds on error probabilities, including the
decoding error probability and the collision miss detection probability, of
random multiple access systems with a finite codeword length. Achievable error
exponents are obtained by taking the codeword length to infinity.
Comment: submitted to IEEE Transactions on Information Theory
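As a toy illustration of how an error exponent emerges from finite-length error probabilities as the codeword length grows (here a single-user repetition code on a binary symmetric channel, not the random-access setting of the paper), the empirical exponent -log2(P_e)/n can be compared against its Chernoff/Sanov limit:

```python
import math

def repetition_error_prob(n, p):
    """Exact ML error probability of an n-fold repetition code on a BSC(p),
    for odd n: an error occurs iff more than n/2 bits are flipped."""
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

p = 0.1
# Limiting exponent: KL divergence D(1/2 || p), about 0.737 bits for p = 0.1
D = 0.5 * math.log2(0.5 / p) + 0.5 * math.log2(0.5 / (1 - p))

for n in (11, 51, 201):
    pe = repetition_error_prob(n, p)
    print(n, pe, -math.log2(pe) / n)  # empirical exponent approaches D
```

The finite-n values carry polynomial corrections; only in the limit does the clean exponent appear, mirroring the abstract's passage from finite-length bounds to asymptotic exponents.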
Interference Mitigation in Large Random Wireless Networks
A central problem in the operation of large wireless networks is how to deal
with interference -- the unwanted signals being sent by transmitters that a
receiver is not interested in. This thesis looks at ways of combating such
interference.
In Chapters 1 and 2, we outline the necessary information and communication
theory background, including the concept of capacity. We also include an
overview of a new set of schemes for dealing with interference known as
interference alignment, paying special attention to a channel-state-based
strategy called ergodic interference alignment.
In Chapter 3, we consider the operation of large regular and random networks
by treating interference as background noise. We consider the local performance
of a single node, and the global performance of a very large network.
In Chapter 4, we use ergodic interference alignment to derive the asymptotic
sum-capacity of large random dense networks. These networks are derived from a
physical model of node placement where signal strength decays over the distance
between transmitters and receivers. (See also arXiv:1002.0235 and
arXiv:0907.5165.)
In Chapter 5, we look at methods of reducing the long time delays incurred by
ergodic interference alignment. We analyse the tradeoff between reducing delay
and lowering the communication rate. (See also arXiv:1004.0208.)
In Chapter 6, we outline a problem that is equivalent to the problem of
pooled group testing for defective items. We then present some new work that
uses information-theoretic techniques to attack group testing. We introduce for
the first time the concept of the group testing channel, which allows for
modelling of a wide range of statistical error models for testing. We derive
new results on the number of tests required to accurately detect defective
items, including when sequential 'adaptive' tests are used.
Comment: PhD thesis, University of Bristol, 201
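A minimal sketch of nonadaptive group testing in the noiseless setting, using a Bernoulli test design and the standard COMP decoder (the parameter choices below are illustrative, not taken from the thesis):

```python
import numpy as np

rng = np.random.default_rng(0)
n_items, k, n_tests = 100, 3, 60          # k defective items among n_items
truth = np.zeros(n_items, dtype=bool)
truth[rng.choice(n_items, size=k, replace=False)] = True

# Bernoulli design: each item joins each pool independently, with
# inclusion probability on the order of 1/k
X = rng.random((n_tests, n_items)) < 1.0 / k

# Noiseless OR channel: a test is positive iff its pool contains
# at least one defective item
y = (X & truth).any(axis=1)

# COMP decoder: any item appearing in a negative test is ruled out;
# everything remaining is declared defective. This never misses a
# defective but may produce false positives.
declared = np.ones(n_items, dtype=bool)
for pool, positive in zip(X, y):
    if not positive:
        declared[pool] = False

print(np.flatnonzero(truth), np.flatnonzero(declared))
```

With enough tests the false positives vanish with high probability; quantifying how many tests suffice, also under noisy test outcomes, is exactly the question the abstract's "group testing channel" formalizes.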
Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities
This monograph presents a unified treatment of single- and multi-user
problems in Shannon's information theory where we depart from the requirement
that the error probability decays asymptotically in the blocklength. Instead,
the error probabilities for various problems are bounded above by a
non-vanishing constant and the spotlight is shone on achievable coding rates as
functions of the growing blocklengths. This represents the study of asymptotic
estimates with non-vanishing error probabilities.
In Part I, after reviewing the fundamentals of information theory, we discuss
Strassen's seminal result for binary hypothesis testing where the type-I error
probability is non-vanishing and the rate of decay of the type-II error
probability with growing number of independent observations is characterized.
In Part II, we use this basic hypothesis testing result to develop second- and
sometimes, even third-order asymptotic expansions for point-to-point
communication. Finally in Part III, we consider network information theory
problems for which the second-order asymptotics are known. These problems
include some classes of channels with random state, the multiple-encoder
distributed lossless source coding (Slepian-Wolf) problem and special cases of
the Gaussian interference and multiple-access channels. Finally, we discuss
avenues for further research.
Comment: Further comments welcome
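The flavor of these expansions can be sketched for the simplest case, the binary symmetric channel: the maximal rate at blocklength n and fixed error probability epsilon is approximately C - sqrt(V/n) * Qinv(epsilon), where V is the channel dispersion and Qinv the inverse Gaussian tail. A small numerical sketch (the parameter values are illustrative):

```python
from math import log2, sqrt
from statistics import NormalDist

def bsc_normal_approx(p, n, eps):
    """Second-order (normal) approximation to the maximal coding rate of a
    BSC(p) at blocklength n and non-vanishing error probability eps."""
    C = 1 + p * log2(p) + (1 - p) * log2(1 - p)   # capacity in bits/use
    V = p * (1 - p) * (log2((1 - p) / p)) ** 2    # channel dispersion
    Qinv = NormalDist().inv_cdf(1 - eps)          # inverse Gaussian tail
    return C - sqrt(V / n) * Qinv

p, eps = 0.11, 1e-3
for n in (500, 2000, 10000):
    print(n, bsc_normal_approx(p, n, eps))  # rate climbs toward capacity
```

The sqrt(V/n) backoff from capacity, rather than an exponentially decaying error, is the defining feature of the non-vanishing-error regime the monograph studies.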