Rate-distance tradeoff for codes above graph capacity
The capacity of a graph is defined as the rate of exponential growth of
independent sets in the strong powers of the graph. In the strong power an edge
connects two sequences if at each position their letters are equal or adjacent.
We consider a variation of the problem where edges in the power graphs are
removed between sequences which differ in more than a fraction of
coordinates. The proposed generalization can be interpreted as the problem of
determining the highest rate of zero undetected-error communication over a link
with adversarial noise, where only a fraction of symbols can be
perturbed and only some substitutions are allowed.
We derive lower bounds on achievable rates by combining graph homomorphisms
with a graph-theoretic generalization of the Gilbert-Varshamov bound. We then
give an upper bound, based on Delsarte's linear programming approach, which
combines Lov\'asz' theta function with the construction used by McEliece et al.
for bounding the minimum distance of codes in Hamming spaces.
Comment: 5 pages. Presented at 2016 IEEE International Symposium on Information Theory
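As a concrete illustration of the definitions above (my own sketch, not part of the paper): the brute force below computes independence numbers of strong powers of the 5-cycle C5, whose capacity is famously sqrt(5). The search cap of 5 for the second power is justified by the Lovász theta bound mentioned in the abstract, since theta(C5) = sqrt(5) gives alpha(C5 boxtimes C5) <= theta(C5)^2 = 5; the search certifies that this is achieved. Function names are mine.

```python
from itertools import combinations, product

# 5-cycle C5: vertices 0..4, consecutive vertices adjacent (mod 5)
N = 5
EDGES = {frozenset((i, (i + 1) % N)) for i in range(N)}

def strong_adjacent(u, v):
    """Strong-power adjacency: two distinct sequences are connected iff
    at every position their letters are equal or adjacent in C5."""
    return u != v and all(
        a == b or frozenset((a, b)) in EDGES for a, b in zip(u, v)
    )

def independence_number(k, upper):
    """Brute-force the maximum independent set in the k-th strong power
    of C5, searching downward from a known upper bound on its size."""
    verts = list(product(range(N), repeat=k))
    for size in range(upper, 0, -1):
        for S in combinations(verts, size):
            if all(not strong_adjacent(u, v) for u, v in combinations(S, 2)):
                return size
    return 0

a1 = independence_number(1, upper=2)   # alpha(C5) = 2
# Lovasz theta bound: alpha of the second strong power is at most 5
a2 = independence_number(2, upper=5)   # = 5, e.g. {(i, 2i mod 5)}
rate = a2 ** 0.5                       # capacity of C5 is at least sqrt(5)
```

The growth from 2 to 5 independent sequences when going from one to two coordinates is exactly the "rate of exponential growth" the abstract's capacity definition captures.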
Interference Mitigation in Large Random Wireless Networks
A central problem in the operation of large wireless networks is how to deal
with interference -- the unwanted signals being sent by transmitters that a
receiver is not interested in. This thesis looks at ways of combating such
interference.
In Chapters 1 and 2, we outline the necessary information and communication
theory background, including the concept of capacity. We also include an
overview of a new set of schemes for dealing with interference known as
interference alignment, paying special attention to a channel-state-based
strategy called ergodic interference alignment.
In Chapter 3, we consider the operation of large regular and random networks
by treating interference as background noise. We consider the local performance
of a single node, and the global performance of a very large network.
In Chapter 4, we use ergodic interference alignment to derive the asymptotic
sum-capacity of large random dense networks. These networks are derived from a
physical model of node placement where signal strength decays over the distance
between transmitters and receivers. (See also arXiv:1002.0235 and
arXiv:0907.5165.)
In Chapter 5, we look at methods of reducing the long time delays incurred by
ergodic interference alignment. We analyse the tradeoff between reducing delay
and lowering the communication rate. (See also arXiv:1004.0208.)
In Chapter 6, we outline a problem that is equivalent to the problem of
pooled group testing for defective items. We then present some new work that
uses information theoretic techniques to attack group testing. We introduce for
the first time the concept of the group testing channel, which allows for
modelling of a wide range of statistical error models for testing. We derive
new results on the number of tests required to accurately detect defective
items, including when using sequential `adaptive' tests.
Comment: PhD thesis, University of Bristol, 201
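As an illustrative aside (my own sketch, with assumed names such as `comp_decode`; the thesis's information-theoretic analysis is far more general): noiseless non-adaptive group testing with a Bernoulli pool design and the simple COMP decoder, which clears every item that appears in a negative pool and declares the rest defective.

```python
import random

def comp_decode(pools, outcomes, n):
    """COMP decoding: any item in a pool that tested negative is
    certainly non-defective; everything else is declared defective.
    Never produces false negatives, may produce false positives."""
    candidates = set(range(n))
    for pool, positive in zip(pools, outcomes):
        if not positive:
            candidates -= pool
    return candidates

def simulate(n=20, defective=frozenset({3, 11}), t=30, p=0.3, seed=1):
    """Run t random pooled tests: each item joins each pool with
    probability p; a pool is positive iff it contains a defective."""
    rng = random.Random(seed)
    pools = [{i for i in range(n) if rng.random() < p} for _ in range(t)]
    outcomes = [bool(pool & defective) for pool in pools]  # noiseless OR
    return comp_decode(pools, outcomes, n)

estimate = simulate()
```

The noisy "group testing channel" of the thesis generalizes the noiseless OR outcome used in this toy simulation.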
On Secure Distributed Data Storage Under Repair Dynamics
We address the problem of securing distributed storage systems against
passive eavesdroppers that can observe a limited number of storage nodes. An
important aspect of these systems is node failures over time, which demand a
repair mechanism aimed at maintaining a targeted high level of system
reliability. If an eavesdropper observes a node that is added to the system to
replace a failed node, it will have access to all the data downloaded during
repair, which can potentially compromise the entire information in the system.
We are interested in determining the secrecy capacity of distributed storage
systems under repair dynamics, i.e., the maximum amount of data that can be
securely stored and made available to a legitimate user without revealing any
information to any eavesdropper. We derive a general upper bound on the secrecy
capacity and show that this bound is tight for the bandwidth-limited regime
which is of importance in scenarios such as peer-to-peer distributed storage
systems. We also provide a simple explicit code construction that achieves the
capacity for this regime.
Comment: 5 pages, 4 figures, to appear in Proceedings of IEEE ISIT 201
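The paper's construction is not reproduced here, but the textbook idea behind such secrecy codes, padding stored data with independent random keys so that anything an eavesdropper observes on a single node is uniformly distributed, can be sketched in a two-node toy example (illustrative only; function names are mine):

```python
import secrets

def encode(secret_bits):
    """Coset-style toy code: for each secret bit s, draw a fresh random
    key bit r; node 1 stores r and node 2 stores r XOR s.  Each node on
    its own holds uniformly random bits, so an eavesdropper observing
    one node learns nothing; a legitimate user reading both recovers s."""
    node1, node2 = [], []
    for s in secret_bits:
        r = secrets.randbelow(2)
        node1.append(r)
        node2.append(r ^ s)
    return node1, node2

def decode(node1, node2):
    """XOR the two nodes' contents to strip the random keys."""
    return [a ^ b for a, b in zip(node1, node2)]

msg = [1, 0, 1, 1]
n1, n2 = decoded = encode(msg)
recovered = decode(n1, n2)
```

The hard part the paper addresses, which this sketch ignores, is keeping this secrecy guarantee intact when failed nodes are repaired by downloading data from surviving nodes.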
On palimpsests in neural memory: an information theory viewpoint
The finite capacity of neural memory and the
reconsolidation phenomenon suggest it is important to be able
to update stored information as in a palimpsest, where new
information overwrites old information. Moreover, changing
information in memory is metabolically costly. In this paper, we
suggest that information-theoretic approaches may inform the
fundamental limits in constructing such a memory system. In
particular, we define malleable coding, which considers not only
representation length but also ease of representation update,
thereby encouraging some form of recycling to convert an old
codeword into a new one. Malleability cost is the difficulty of
synchronizing compressed versions, and malleable codes are of
particular interest when representing information and modifying
the representation are both expensive. We examine the tradeoff
between compression efficiency and malleability cost, under a
malleability metric defined with respect to a string edit distance.
This introduces a metric topology to the compressed domain. We
characterize the exact set of achievable rates and malleability as
the solution of a subgraph isomorphism problem. This is all done within
the framework of the optimization approach to biology.
Accepted manuscript
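For readers unfamiliar with the metric: the string edit distance underlying the malleability metric above is the standard Levenshtein distance, the minimum number of single-character insertions, deletions, and substitutions turning one string into another. A minimal dynamic-programming sketch (illustration only, not code from the paper):

```python
def edit_distance(a, b):
    """Levenshtein distance between strings a and b via the classic
    row-by-row dynamic program (O(len(a) * len(b)) time)."""
    prev = list(range(len(b) + 1))        # distances from "" to prefixes of b
    for i, ca in enumerate(a, 1):
        cur = [i]                          # deleting all i characters of a
        for j, cb in enumerate(b, 1):
            cur.append(min(
                prev[j] + 1,               # delete ca
                cur[j - 1] + 1,            # insert cb
                prev[j - 1] + (ca != cb),  # match or substitute
            ))
        prev = cur
    return prev[-1]
```

Under a malleability cost of this kind, a code is "malleable" when a small change to the source requires only a small edit to the compressed representation.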
High-rate locally-correctable and locally-testable codes with sub-polynomial query complexity
In this work, we construct the first locally-correctable codes (LCCs), and
locally-testable codes (LTCs) with constant rate, constant relative distance,
and sub-polynomial query complexity. Specifically, we show that there exist
binary LCCs and LTCs with block length $n$, constant rate (which can even be
taken arbitrarily close to 1), constant relative distance, and query
complexity $\exp(\tilde{O}(\sqrt{\log n}))$. Previously such codes were known
to exist only with query complexity $n^{\epsilon}$ (for constant
$\epsilon > 0$), and
there were several, quite different, constructions known.
Our codes are based on a general distance-amplification method of Alon and
Luby~\cite{AL96_codes}. We show that this method interacts well with local
correctors and testers, and obtain our main results by applying it to suitably
constructed LCCs and LTCs in the non-standard regime of \emph{sub-constant
relative distance}.
Along the way, we also construct LCCs and LTCs over large alphabets, with the
same query complexity $\exp(\tilde{O}(\sqrt{\log n}))$, which additionally have
the property of approaching the Singleton bound: they have almost the
best-possible relationship between their rate and distance. This has the
surprising consequence that asking for a large-alphabet error-correcting code
to further be an LCC or LTC with query complexity
$\exp(\tilde{O}(\sqrt{\log n}))$ does not require any sacrifice in terms of
rate and distance! Such a
result was previously not known for any query complexity.
Our results on LCCs also immediately give locally-decodable codes (LDCs) with
the same parameters.
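The constructions in the paper are too involved for a short sketch, but the mechanics of local correction can be illustrated with the classical 2-query local corrector for the Hadamard code (my example, not the paper's method): to recover one codeword position from a corrupted copy, query a random pair of positions whose XOR is the target index and combine the answers by linearity.

```python
def hadamard_encode(m, k):
    """Hadamard codeword of message m in F_2^k: position x holds the
    inner product <m, x> over F_2, i.e. the parity of popcount(m & x)."""
    return [bin(m & x).count("1") % 2 for x in range(1 << k)]

def local_correct(word, x, k):
    """Recover position x from a corrupted codeword.  By linearity,
    word[r] ^ word[x ^ r] equals <m, x> whenever neither query hits a
    corruption.  For a deterministic demo we take the majority vote
    over all r; the actual local corrector samples one random r and
    makes only 2 queries, succeeding with high probability."""
    votes = sum(word[r] ^ word[x ^ r] for r in range(1 << k))
    return 1 if 2 * votes > (1 << k) else 0

k = 3
clean = hadamard_encode(5, k)
corrupted = clean[:]
corrupted[3] ^= 1                                   # flip one position
decoded = [local_correct(corrupted, x, k) for x in range(1 << k)]
```

The Hadamard code has tiny rate ($k$ bits stretched to $2^k$); the point of the paper above is precisely to get locality like this while keeping the rate constant.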