Interference Mitigation in Large Random Wireless Networks
A central problem in the operation of large wireless networks is how to deal
with interference -- the unwanted signals from transmitters that a receiver is
not interested in. This thesis looks at ways of combating such interference.
In Chapters 1 and 2, we outline the necessary information and communication
theory background, including the concept of capacity. We also include an
overview of a new set of schemes for dealing with interference known as
interference alignment, paying special attention to a channel-state-based
strategy called ergodic interference alignment.
In Chapter 3, we consider the operation of large regular and random networks
by treating interference as background noise. We consider the local performance
of a single node, and the global performance of a very large network.
In Chapter 4, we use ergodic interference alignment to derive the asymptotic
sum-capacity of large random dense networks. These networks are derived from a
physical model of node placement where signal strength decays over the distance
between transmitters and receivers. (See also arXiv:1002.0235 and
arXiv:0907.5165.)
In Chapter 5, we look at methods of reducing the long time delays incurred by
ergodic interference alignment. We analyse the tradeoff between reducing delay
and lowering the communication rate. (See also arXiv:1004.0208.)
In Chapter 6, we outline a problem equivalent to pooled group testing for
defective items. We then present new work that uses information-theoretic
techniques to attack group testing. We introduce for the first time the
concept of the group testing channel, which allows a wide range of
statistical error models for testing to be modelled. We derive new results on
the number of tests required to accurately detect defective items, including
when using sequential 'adaptive' tests.
Comment: PhD thesis, University of Bristol, 201
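As a toy illustration of the pooled group testing setting described in this abstract (not the thesis's own algorithms), the sketch below pools items into random tests and decodes with the simple COMP rule: in the noiseless model, any item appearing in a negative pool cannot be defective. All names and parameter values here are illustrative assumptions.

```python
import random

def comp_group_testing(n, defective, num_tests, p=0.1, seed=0):
    """Noiseless pooled group testing decoded with the COMP rule:
    every item that appears in at least one negative test is ruled out.
    Returns the set of items that survive (the candidate defectives)."""
    rng = random.Random(seed)
    # Each item joins each pool independently with probability p.
    pools = [[i for i in range(n) if rng.random() < p] for _ in range(num_tests)]
    # A test is positive iff its pool contains at least one defective item.
    outcomes = [any(i in defective for i in pool) for pool in pools]
    candidates = set(range(n))
    for pool, positive in zip(pools, outcomes):
        if not positive:
            candidates -= set(pool)  # everyone in a negative pool is clean
    return candidates

# 100 items, 3 defectives; on the order of k*log(n) tests usually suffice.
found = comp_group_testing(100, {4, 17, 62}, num_tests=120)
```

Because COMP never eliminates a true defective in the noiseless model, `found` always contains {4, 17, 62}; with enough tests the stray false positives disappear as well. The thesis's information-theoretic analysis concerns exactly how many tests are needed for such recovery, including under noisy and adaptive variants.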
The Ambiguity of Simplicity
A system's apparent simplicity depends on whether it is represented
classically or quantally. This is not so surprising, as classical and quantum
physics are descriptive frameworks built on different assumptions that capture,
emphasize, and express different properties and mechanisms. What is surprising
is that, as we demonstrate, simplicity is ambiguous: the relative simplicity
between two systems can change sign when moving between classical and quantum
descriptions. Thus, notions of absolute physical simplicity---minimal structure
or memory---at best form a partial, not a total, order. This suggests that
appeals to principles of physical simplicity, via Ockham's Razor or to the
"elegance" of competing theories, may be fundamentally subjective, perhaps even
beyond the purview of physics itself. It also raises challenging questions in
model selection between classical and quantum descriptions. Fortunately,
experiments are now beginning to probe measures of simplicity, creating the
potential to directly test for ambiguity.
Comment: 7 pages, 6 figures, http://csc.ucdavis.edu/~cmg/compmech/pubs/aos.ht