Limits on Support Recovery with Probabilistic Models: An Information-Theoretic Framework
The support recovery problem consists of determining a sparse subset of a set
of variables that is relevant in generating a set of observations, and arises
in a diverse range of settings such as compressive sensing, subset selection
in regression, and group testing. In this paper, we take a unified
approach to support recovery problems, considering general probabilistic models
relating a sparse data vector to an observation vector. We study the
information-theoretic limits of both exact and partial support recovery, taking
a novel approach motivated by thresholding techniques in channel coding. We
provide general achievability and converse bounds characterizing the trade-off
between the error probability and number of measurements, and we specialize
these to the linear, 1-bit, and group testing models. In several cases, our
bounds not only provide matching scaling laws in the necessary and sufficient
number of measurements, but also sharp thresholds with matching constant
factors. Our approach has several advantages over previous approaches: For the
achievability part, we obtain sharp thresholds under broader scalings of the
sparsity level and other parameters (e.g., signal-to-noise ratio) compared to
several previous works, and for the converse part, we not only provide
conditions under which the error probability fails to vanish, but also
conditions under which it tends to one.
Comment: Accepted to IEEE Transactions on Information Theory; presented in
part at ISIT 2015 and SODA 201
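As a concrete instance of one of the specialized models above, here is a minimal sketch of noiseless group testing with brute-force exact support recovery; all names and parameters are illustrative, not taken from the paper:

```python
import random
from itertools import combinations

def group_testing_measurements(n, defective, num_tests, p, rng):
    """Noiseless group testing: each test pools each item independently
    with probability p; the outcome is 1 iff the pool hits a defective."""
    tests = []
    for _ in range(num_tests):
        pool = {i for i in range(n) if rng.random() < p}
        tests.append((pool, int(bool(pool & defective))))
    return tests

def consistent_supports(n, k, tests):
    """Exhaustive exact support recovery: all k-subsets consistent with
    every observed test outcome (feasible only for tiny n)."""
    return [set(c) for c in combinations(range(n), k)
            if all(int(bool(set(c) & pool)) == y for pool, y in tests)]

rng = random.Random(0)
n, defective = 12, {2, 7}
tests = group_testing_measurements(n, defective, num_tests=25, p=0.3, rng=rng)
print(consistent_supports(n, 2, tests))
```

The true support is always consistent with the noiseless outcomes; as the number of tests grows, it typically becomes the unique consistent k-subset, which is the error-probability-versus-measurements trade-off the paper's bounds quantify.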
Quantum Cryptography Beyond Quantum Key Distribution
Quantum cryptography is the art and science of exploiting quantum mechanical
effects in order to perform cryptographic tasks. While the most well-known
example of this discipline is quantum key distribution (QKD), there exist many
other applications such as quantum money, randomness generation, secure two-
and multi-party computation and delegated quantum computation. Quantum
cryptography also studies the limitations and challenges resulting from quantum
adversaries---including the impossibility of quantum bit commitment, the
difficulty of quantum rewinding and the definition of quantum security models
for classical primitives. In this review article, aimed primarily at
cryptographers unfamiliar with the quantum world, we survey the area of
theoretical quantum cryptography, with an emphasis on the constructions and
limitations beyond the realm of QKD.
Comment: 45 pages, over 245 references
Strictly contractive quantum channels and physically realizable quantum computers
We study the robustness of quantum computers under the influence of errors
modelled by strictly contractive channels. A channel T is defined to be
strictly contractive if, for any pair of density operators ρ, σ in its
domain, ‖T(ρ) − T(σ)‖₁ ≤ k‖ρ − σ‖₁ for some 0 ≤ k < 1 (here ‖·‖₁ denotes
the trace norm). In other words, strictly
contractive channels render the states of the computer less distinguishable in
the sense of quantum detection theory. Starting from the premise that all
experimental procedures can be carried out with finite precision, we argue that
there exists a physically meaningful connection between strictly contractive
channels and errors in physically realizable quantum computers. We show that,
in the absence of error correction, sensitivity of quantum memories and
computers to strictly contractive errors grows exponentially with storage time
and computation time, respectively, and depends only on the contraction constant k and the
measurement precision. We prove that strict contractivity rules out the
possibility of perfect error correction, and give an argument that approximate
error correction, which covers previous work on fault-tolerant quantum
computation as a special case, is possible.
Comment: 14 pages; revtex, amsfonts, amssymb; made some changes (recommended
by Phys. Rev. A), updated the references
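For intuition, the qubit depolarizing channel is a textbook example of a strictly contractive channel: it shrinks the trace distance between any two states by exactly the factor k = 1 − p. A small numerical check (an illustration of the definition, not code from the paper):

```python
import numpy as np

def depolarize(rho, p):
    """Qubit depolarizing channel: with probability p, replace the
    state by the maximally mixed state I/2."""
    return (1 - p) * rho + p * np.eye(2) / 2

def trace_norm(A):
    """Trace norm ||A||_1 = sum of singular values."""
    return np.linalg.svd(A, compute_uv=False).sum()

# Two pure-state density operators.
rho = np.array([[1, 0], [0, 0]], dtype=complex)            # |0><0|
sigma = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)  # |+><+|

p = 0.3
before = trace_norm(rho - sigma)
after = trace_norm(depolarize(rho, p) - depolarize(sigma, p))

# The identity parts cancel, so the channel contracts trace distance
# by exactly k = 1 - p; t applications contract it by (1 - p)**t,
# i.e. distinguishability decays exponentially, as the abstract notes.
print(after / before)  # ≈ 0.7
```

Since T(ρ) − T(σ) = (1 − p)(ρ − σ) here, the contraction factor is constant, which is what makes repeated noisy evolution erase stored information exponentially fast absent error correction.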
Broadcasting on Random Directed Acyclic Graphs
We study a generalization of the well-known model of broadcasting on trees.
Consider a directed acyclic graph (DAG) with a unique source vertex X, and
suppose all other vertices have indegree d ≥ 2. Let the vertices at
distance k from X be called layer k. At layer 0, X is given a uniform random
bit. At layer k ≥ 1, each vertex receives d bits from its parents in
layer k − 1, which are transmitted along independent binary symmetric channel
edges, and combines them using a d-ary Boolean processing function. The goal
is to reconstruct X with probability of error bounded away from 1/2 using
the values of all vertices at an arbitrarily deep layer. This question is
closely related to models of reliable computation and storage, and information
flow in biological networks.
In this paper, we analyze randomly constructed DAGs, for which we show that
broadcasting is only possible if the noise level is below a certain degree-
and function-dependent critical threshold. For d ≥ 3, and random DAGs with
layer sizes Ω(log k) and majority processing functions, we identify the
critical threshold. For d = 2, we establish a similar result for NAND
processing functions. We also prove a partial converse for odd d ≥ 3
illustrating that the identified thresholds are impossible to improve by
selecting different processing functions if the decoder is restricted to using
a single vertex.
Finally, for any noise level, we construct explicit DAGs (using expander
graphs) with bounded degree and layer sizes Θ(log k) admitting
reconstruction. In particular, we show that such DAGs can be generated in
deterministic quasi-polynomial time or randomized polylogarithmic time in the
depth. These results portray a doubly-exponential advantage for storing a bit
in DAGs compared to trees, where d = 1 but layer sizes must grow exponentially
with depth in order to enable broadcasting.
Comment: 33 pages, double column format. arXiv admin note: text overlap with
arXiv:1803.0752
The capacity of non-identical adaptive group testing
We consider the group testing problem, in the case where the items are
defective independently but with non-constant probability. We introduce and
analyse an algorithm to solve this problem by grouping items together
appropriately. We give conditions under which the algorithm performs
essentially optimally in the sense of information-theoretic capacity. We use
concentration of measure results to bound the probability that this algorithm
requires many more tests than the expected number. This has applications to the
allocation of spectrum to cognitive radios, in the case where a database gives
prior information that a particular band will be occupied.Comment: To be presented at Allerton 201
- …