34,064 research outputs found
Quantum replication at the Heisenberg limit
No process in nature can perfectly clone an arbitrary quantum state. But is
it possible to engineer processes that replicate quantum information with
vanishingly small error? Here we demonstrate the possibility of probabilistic
super-replication phenomena where N equally prepared quantum clocks are
transformed into a much larger number of M nearly perfect replicas, with an
error that rapidly vanishes whenever M is small compared to the square of N.
The quadratic replication rate is the ultimate limit imposed by Quantum
Mechanics to the proliferation of information and is fundamentally linked with
the Heisenberg limit of quantum metrology.
Comment: 9 + 16 pages, 2 figures, published version
Reduction of Markov Chains using a Value-of-Information-Based Approach
In this paper, we propose an approach to obtain reduced-order models of
Markov chains. Our approach is composed of two information-theoretic processes.
The first is a means of comparing pairs of stationary chains on different state
spaces, which is done via the negative Kullback-Leibler divergence defined on a
model joint space. Model reduction is achieved by solving a
value-of-information criterion with respect to this divergence. Optimizing the
criterion leads to a probabilistic partitioning of the states in the high-order
Markov chain. A single free parameter that emerges through the optimization
process dictates both the partition uncertainty and the number of state groups.
We provide a data-driven means of choosing the 'optimal' value of this free
parameter, which sidesteps the need to know a priori the number of state groups
in an arbitrary chain.
Comment: Submitted to Entropy
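The paper defines its divergence on a model joint space; as a simpler, illustrative sketch of comparing two stationary Markov chains with a Kullback-Leibler-type quantity, one can weight the row-wise KL divergence of the transition matrices by the stationary distribution of the first chain. This weighting scheme is an assumption for illustration, not the paper's construction:

```python
import math

def stationary(P, iters=500):
    # Power-iterate a row-stochastic matrix toward its stationary distribution.
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def kl_rate(P, Q):
    # KL divergence rate between chains P and Q, weighted by P's stationary law:
    #   sum_i pi(i) * sum_j P(i,j) * log(P(i,j) / Q(i,j))
    pi = stationary(P)
    n = len(P)
    return sum(
        pi[i] * P[i][j] * math.log(P[i][j] / Q[i][j])
        for i in range(n) for j in range(n)
        if P[i][j] > 0
    )

# A 2-state chain and a perturbed copy: the divergence rate is small but positive.
P = [[0.9, 0.1], [0.2, 0.8]]
Q = [[0.8, 0.2], [0.3, 0.7]]
d = kl_rate(P, Q)
```

A value-of-information criterion would then trade such a divergence off against the coarseness of a state partition; that optimization is beyond this toy sketch.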
Fuzzy Extractors: How to Generate Strong Keys from Biometrics and Other Noisy Data
We provide formal definitions and efficient secure techniques for
- turning noisy information into keys usable for any cryptographic
application, and, in particular,
- reliably and securely authenticating biometric data.
Our techniques apply not just to biometric information, but to any keying
material that, unlike traditional cryptographic keys, is (1) not reproducible
precisely and (2) not distributed uniformly. We propose two primitives: a
"fuzzy extractor" reliably extracts nearly uniform randomness R from its input;
the extraction is error-tolerant in the sense that R will be the same even if
the input changes, as long as it remains reasonably close to the original.
Thus, R can be used as a key in a cryptographic application. A "secure sketch"
produces public information about its input w that does not reveal w, and yet
allows exact recovery of w given another value that is close to w. Thus, it can
be used to reliably reproduce error-prone biometric inputs without incurring
the security risk inherent in storing them.
We define the primitives to be both formally secure and versatile,
generalizing much prior work. In addition, we provide nearly optimal
constructions of both primitives for various measures of "closeness" of input
data, such as Hamming distance, edit distance, and set difference.
Comment: 47 pp., 3 figures. Prelim. version in Eurocrypt 2004, Springer LNCS
3027, pp. 523-540. Differences from version 3: minor edits for grammar,
clarity, and typos
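For Hamming distance, a standard way to realize a secure sketch is the code-offset method: publish s = w XOR c for a random codeword c, then recover w from any nearby w' by decoding. The sketch below is a minimal toy version using a 5-fold repetition code; the repetition code and the 20-bit input length are illustrative choices, and practical constructions use stronger error-correcting codes:

```python
import secrets

R = 5  # repetition factor: each data bit is encoded as R identical code bits

def encode(bits):
    # Repetition code: 0 -> 00000, 1 -> 11111.
    return [b for b in bits for _ in range(R)]

def decode(bits):
    # Majority vote per block corrects fewer than R/2 bit flips.
    return [int(sum(bits[i:i + R]) * 2 > R) for i in range(0, len(bits), R)]

def sketch(w):
    # Public helper data s = w XOR c for a fresh random codeword c.
    c = encode([secrets.randbelow(2) for _ in range(len(w) // R)])
    return [wi ^ ci for wi, ci in zip(w, c)]

def recover(w_prime, s):
    # w' XOR s = c XOR e, where e is the error pattern; decoding removes e,
    # and s XOR c gives back the exact original w.
    noisy_c = [wi ^ si for wi, si in zip(w_prime, s)]
    c = encode(decode(noisy_c))
    return [si ^ ci for si, ci in zip(s, c)]

# Demo: 4 data bits (20 code bits), then one bit error in two different blocks.
w = [secrets.randbelow(2) for _ in range(4 * R)]
s = sketch(w)
w_noisy = list(w)
w_noisy[0] ^= 1
w_noisy[7] ^= 1
recovered = recover(w_noisy, s)
```

A fuzzy extractor is then obtained by additionally hashing w with a strong randomness extractor to produce the uniform key R.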
Local Exchangeability
Exchangeability, in which the distribution of an infinite sequence is
invariant to reorderings of its elements, implies the existence of a simple
conditional independence structure that may be leveraged in the design of
probabilistic models, efficient inference algorithms, and randomization-based
testing procedures. In practice, however, this assumption is too strong an
idealization; the distribution typically fails to be exactly invariant to
permutations and de Finetti's representation theory does not apply. Thus there
is the need for a distributional assumption that is both weak enough to hold in
practice, and strong enough to guarantee a useful underlying representation. We
introduce a relaxed notion of local exchangeability, under which swapping data
associated with nearby covariates causes a bounded change in the distribution.
We prove that locally exchangeable processes correspond to independent
observations from an underlying measure-valued stochastic process. We thereby
show that de Finetti's theorem is robust to perturbation and provide further
justification for the Bayesian modelling approach. Using this probabilistic
result, we develop three novel statistical procedures for (1) estimating the
underlying process via local empirical measures, (2) testing via local
randomization, and (3) estimating the canonical premetric of local
exchangeability. These three procedures extend the applicability of previous
exchangeability-based methods without sacrificing rigorous statistical
guarantees. The paper concludes with examples of popular statistical models
that exhibit local exchangeability.
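As a rough illustration of procedure (2), a local randomization test permutes responses only among observations whose covariates are close, so each swap respects (approximate) local exchangeability. The binning scheme and statistic below are illustrative assumptions, not the paper's actual procedure:

```python
import random

def local_permutation_test(x, y, stat, bin_width=1.0, n_perm=500, seed=0):
    # Null distribution of stat(x, y) obtained by shuffling y only within
    # groups of observations whose covariates fall in the same bin.  Under
    # (approximate) local exchangeability such swaps barely change the joint
    # distribution, so the permuted statistics calibrate a p-value.
    rng = random.Random(seed)
    bins = {}
    for i, xi in enumerate(x):
        bins.setdefault(int(xi // bin_width), []).append(i)
    observed = stat(x, y)
    count = 0
    for _ in range(n_perm):
        y_perm = list(y)
        for idx in bins.values():
            vals = [y[i] for i in idx]
            rng.shuffle(vals)
            for i, v in zip(idx, vals):
                y_perm[i] = v
        if stat(x, y_perm) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)

# Demo: covariates clustered near 0, 1, and 2, with a simple cross-moment statistic.
x = [0.1, 0.2, 1.1, 1.3, 2.0, 2.4]
y = [1, 0, 1, 0, 1, 0]
p_val = local_permutation_test(x, y, lambda xs, ys: sum(a * b for a, b in zip(xs, ys)))
```

The bin width plays the role of the "nearby covariates" radius; the paper's canonical premetric would give a principled choice of that scale.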
Probabilistic Spectral Sparsification In Sublinear Time
In this paper, we introduce a variant of spectral sparsification, called
probabilistic spectral sparsification. Roughly speaking, it preserves the
value of any cut up to a multiplicative error together with an additive error.
We show how to produce such a probabilistic spectral sparsifier with few edges
in sublinear time for unweighted undirected graphs. This gives the fastest
known sublinear-time algorithms for several cut problems on unweighted
undirected graphs, such as:
- a sublinear-time approximation algorithm for the sparsest cut problem and
the balanced separator problem;
- a sublinear-time approximate minimum s-t cut algorithm with an additive
error
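The paper's sublinear-time construction is far more refined; as a toy illustration of the principle underlying cut sparsification (sampled, reweighted edges preserve every cut in expectation), consider uniform edge sampling with importance weights 1/p. This simple scheme is an illustrative assumption and carries no worst-case guarantee:

```python
import random

def sample_sparsifier(edges, p, rng):
    # Keep each edge independently with probability p, reweighted by 1/p,
    # so that every cut's weight is preserved in expectation.
    return [(u, v, 1.0 / p) for (u, v) in edges if rng.random() < p]

def cut_weight(edges, S):
    # Total weight of edges crossing the cut (S, V \ S).
    return sum(w for (u, v, w) in edges if (u in S) != (v in S))

# Toy check on the complete graph K6: the cut around S = {0, 1, 2} has 9 edges.
edges = [(u, v) for u in range(6) for v in range(u + 1, 6)]
S = {0, 1, 2}
true_cut = cut_weight([(u, v, 1.0) for (u, v) in edges], S)
rng = random.Random(0)
trials = [cut_weight(sample_sparsifier(edges, 0.5, rng), S) for _ in range(2000)]
avg = sum(trials) / len(trials)
```

Averaged over many samples, the sparsified cut weight concentrates around the true value of 9; real sparsifiers achieve such concentration for all cuts simultaneously from a single sample.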
Short seed extractors against quantum storage
Some, but not all, extractors resist adversaries with limited quantum
storage. In this paper we show that Trevisan's extractor has this property,
thereby showing an extractor against quantum storage with logarithmic seed
length.
Improved Quantum Communication Complexity Bounds for Disjointness and Equality
We prove new bounds on the quantum communication complexity of the
disjointness and equality problems. For the case of exact and non-deterministic
protocols we show that these complexities are all equal to n+1, the previous
best lower bound being n/2. We show this by improving a general bound for
non-deterministic protocols of de Wolf. We also give an O(sqrt{n}c^{log^*
n})-qubit bounded-error protocol for disjointness, modifying and improving the
earlier O(sqrt{n}log n) protocol of Buhrman, Cleve, and Wigderson, and prove an
Omega(sqrt{n}) lower bound for a large class of protocols that includes the
BCW protocol as well as our new protocol.
Comment: 11 pages, LaTeX