The quantum query complexity of certification
We study the quantum query complexity of finding a certificate for a
d-regular, k-level balanced NAND formula. Up to logarithmic factors, we show
that the query complexity is Theta(d^{(k+1)/2}) for 0-certificates, and
Theta(d^{k/2}) for 1-certificates. In particular, this shows that the
zero-error quantum query complexity of evaluating such formulas is
O(d^{(k+1)/2}) (again neglecting a logarithmic factor). Our lower bound relies
on the fact that the quantum adversary method obeys a direct sum theorem.
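The certificate structure behind these bounds can be made concrete with a
short classical recursion (a sketch for illustration only; the function name
and interface are ours, and this is not the quantum algorithm): a NAND gate
outputs 0 only when every child evaluates to 1, so a 0-certificate must
combine 1-certificates for all d children, while a 1-certificate needs only a
0-certificate for a single child.

```python
def nand_certificate(x, d, k, offset=0):
    """Evaluate a d-ary, k-level balanced NAND formula on input x (a list
    of d**k bits) and return (value, certificate), where the certificate
    is a set of input positions whose values force the output.
    Illustrative helper; not from the paper."""
    if k == 0:
        return x[offset], {offset}
    block = d ** (k - 1)
    children = [nand_certificate(x, d, k - 1, offset + i * block)
                for i in range(d)]
    if all(v == 1 for v, _ in children):
        # NAND outputs 0 iff every child is 1: a 0-certificate must
        # exhibit a 1-certificate for each of the d children.
        cert = set().union(*(c for _, c in children))
        return 0, cert
    # Otherwise some child is 0, and that child's certificate alone
    # forces the NAND output to 1.
    v, c = next((v, c) for v, c in children if v == 0)
    return 1, c
```

The recursion makes the size asymmetry visible: 0-certificates multiply by d
at alternate levels while 1-certificates do not, mirroring the gap between
the two query bounds above.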
Optimal Direct Sum Results for Deterministic and Randomized Decision Tree Complexity
A Direct Sum Theorem holds in a model of computation when solving k input
instances together is k times as expensive as solving a single instance. We
show that
Direct Sum Theorems hold in the models of deterministic and randomized decision
trees for all relations. We also note that a near optimal Direct Sum Theorem
holds for quantum decision trees for boolean functions.
Weak Parity
We study the query complexity of Weak Parity: the problem of computing the
parity of an n-bit input string, where one only has to succeed on a 1/2+eps
fraction of input strings, but must do so with high probability on those inputs
where one does succeed. It is well-known that n randomized queries and n/2
quantum queries are needed to compute parity on all inputs. But surprisingly,
we give a randomized algorithm for Weak Parity that makes only
O(n/log^0.246(1/eps)) queries, as well as a quantum algorithm that makes only
O(n/sqrt(log(1/eps))) queries. We also prove a lower bound of
Omega(n/log(1/eps)) in both cases; and using extremal combinatorics, prove
lower bounds of Omega(log n) in the randomized case and Omega(sqrt(log n)) in
the quantum case for any eps>0. We show that improving our lower bounds is
intimately related to two longstanding open problems about Boolean functions:
the Sensitivity Conjecture, and the relationships between query complexity and
polynomial degree.
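Why some dependence on eps is unavoidable can be seen in miniature: any
deterministic strategy that reads only a proper subset of the bits predicts
parity correctly on exactly half of all inputs, since the unread bits leave
the parity uniformly distributed. A brute-force check (the helper below is
ours, for illustration; it is not an algorithm from the paper):

```python
from itertools import product

def best_success_fraction(n, queried):
    """Best possible fraction of n-bit inputs on which parity can be
    predicted after reading only the positions in `queried`: group
    inputs by what the algorithm sees, then guess the majority parity
    in each group. Illustrative helper, not from the paper."""
    buckets = {}
    for x in product((0, 1), repeat=n):
        key = tuple(x[i] for i in queried)
        buckets.setdefault(key, []).append(sum(x) % 2)
    # In each group, the optimal guess is the more common parity value.
    correct = sum(max(ps.count(0), ps.count(1)) for ps in buckets.values())
    return correct / 2 ** n
```

For any proper subset of queried positions the fraction is exactly 1/2; only
reading all n bits reaches 1. The surprise in the result above is that
allowing failure on a 1/2-eps fraction of inputs nevertheless buys a
nontrivial saving in queries.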
Quantum query complexity of state conversion
State conversion generalizes query complexity to the problem of converting
between two input-dependent quantum states by making queries to the input. We
characterize the complexity of this problem by introducing a natural
information-theoretic norm that extends the Schur product operator norm. The
complexity of converting between two systems of states is given by the distance
between them, as measured by this norm.
In the special case of function evaluation, the norm is closely related to
the general adversary bound, a semi-definite program that lower-bounds the
number of input queries needed by a quantum algorithm to evaluate a function.
We thus obtain that the general adversary bound characterizes the quantum query
complexity of any function whatsoever. This generalizes and simplifies the
proof of the same result in the case of boolean input and output. Also in the
case of function evaluation, we show that our norm satisfies a remarkable
composition property, implying that the quantum query complexity of the
composition of two functions is at most the product of the query complexities
of the functions, up to a constant. Finally, our result implies that discrete
and continuous-time query models are equivalent in the bounded-error setting,
even for the general state-conversion problem.
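As a worked instance of the composition property (our example, not one worked
out in the abstract): Grover search gives $Q(\mathrm{OR}_n) = \Theta(\sqrt{n})$
and $Q(\mathrm{AND}_m) = \Theta(\sqrt{m})$, so the composition bound yields

```latex
Q(\mathrm{OR}_n \circ \mathrm{AND}_m)
  = O\bigl(Q(\mathrm{OR}_n)\, Q(\mathrm{AND}_m)\bigr)
  = O(\sqrt{nm}),
```

which matches the known $\Theta(\sqrt{nm})$ complexity of the depth-2 AND-OR
tree on $nm$ bits, so the product bound is tight in this case.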
Distributional Property Testing in a Quantum World
A fundamental problem in statistics and learning theory is to test properties
of distributions. We show that quantum computers can solve such problems with
significant speed-ups. We also introduce a novel access model for quantum
distributions, enabling the coherent preparation of quantum samples, and
propose a general framework that can naturally handle both classical and
quantum distributions in a unified manner. Our framework generalizes and
improves previous quantum algorithms for testing closeness between unknown
distributions, testing independence between two distributions, and estimating
the Shannon / von Neumann entropy of distributions. For classical
distributions our algorithms significantly improve the precision dependence
of some earlier results. We also show that in our framework procedures for
classical distributions can be directly lifted to the more general case of
quantum distributions, and thus obtain the first speed-ups for testing
properties of density operators that can be accessed coherently rather than
only via sampling.
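For context on the classical baseline (an illustration of ours, not an
algorithm from the paper): the naive plug-in estimator computes the entropy
of the empirical distribution of the samples, and it is this kind of sample
and precision dependence that the quantum algorithms improve.

```python
import math
from collections import Counter

def entropy_estimate(samples):
    """Plug-in (empirical) estimator of Shannon entropy, in bits, from a
    list of classical samples. Illustrative baseline, not from the paper."""
    n = len(samples)
    counts = Counter(samples)
    # Entropy of the empirical distribution p_hat(s) = count(s) / n.
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

The plug-in estimator is biased for small sample sizes; the point of the
quantum framework above is to beat the best classical precision dependence,
not merely to reproduce it.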
Robust self-testing of many-qubit states
We introduce a simple two-player test which certifies that the players apply
tensor products of Pauli X and Z observables on the tensor product of n EPR
pairs. The test has constant robustness: any strategy achieving success
probability within an additive eps of the optimal must be poly(eps)-close, in
the appropriate distance measure, to the honest n-qubit strategy. The test
involves 2n-bit questions and 2-bit answers. The key technical ingredient is
a quantum version of the classical linearity test of Blum, Luby, and
Rubinfeld.
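The classical BLR test queries the function at two random points and at their
XOR, accepting iff f(x) XOR f(y) = f(x XOR y); linear functions always pass,
and functions far from linear fail a constant fraction of trials. A minimal
sketch of the classical test (function and parameter names are ours):

```python
import random

def blr_linearity_test(f, n, trials=200, rng=None):
    """Blum-Luby-Rubinfeld linearity test for f on n-bit strings,
    represented as integers with XOR as bitwise ^. Returns the fraction
    of random trials that pass. Illustrative sketch of the classical
    test; the paper's contribution is its quantum analogue."""
    rng = rng or random.Random(0)
    passed = 0
    for _ in range(trials):
        x = rng.getrandbits(n)
        y = rng.getrandbits(n)
        # A linear (GF(2)) function satisfies this identity for all x, y.
        passed += f(x) ^ f(y) == f(x ^ y)
    return passed / trials
```

Parities of a fixed subset of bits pass every trial, while a nonlinear
function such as the AND of two bits is rejected with constant probability
per trial.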
As applications of our result we give (i) the first robust self-test for n
EPR pairs; (ii) a quantum multiprover interactive proof system for the local
Hamiltonian problem with a constant number of provers and classical questions
and answers, and a constant completeness-soundness gap independent of system
size; (iii) a robust protocol for delegated quantum computation.
Trading locality for time: certifiable randomness from low-depth circuits
The generation of certifiable randomness is the most fundamental
information-theoretic task that meaningfully separates quantum devices from
their classical counterparts. We propose a protocol for exponential certified
randomness expansion using a single quantum device. The protocol calls for the
device to implement a simple quantum circuit of constant depth on a 2D lattice
of qubits. The output of the circuit can be verified classically in linear
time, and is guaranteed to contain a polynomial number of certified random bits
assuming that the device used to generate the output operated using a
(classical or quantum) circuit of sub-logarithmic depth. This assumption
contrasts with the locality assumptions underlying randomness certification
from Bell inequality violations, and with the computational assumptions used
in other approaches. To demonstrate
randomness generation it is sufficient for a device to sample from the ideal
output distribution within constant statistical distance.
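The closeness requirement on the device can be checked empirically; a minimal
sketch of estimating total variation (statistical) distance from samples (the
helper is ours, not part of the protocol):

```python
from collections import Counter

def tv_distance(samples, ideal):
    """Total variation distance between the empirical distribution of
    `samples` and an `ideal` distribution given as {outcome: probability}.
    Illustrative helper, not part of the protocol."""
    n = len(samples)
    freq = Counter(samples)
    support = set(freq) | set(ideal)
    # TV distance is half the L1 distance between the two distributions.
    return 0.5 * sum(abs(freq[s] / n - ideal.get(s, 0.0)) for s in support)
```

A device sampling within a small constant of this distance from the ideal
circuit output suffices for the protocol; the estimate itself converges only
as the sample size grows relative to the support of the distribution.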
Our procedure is inspired by recent work of Bravyi et al. (Science 2018), who
introduced a relational problem that can be solved by a constant-depth quantum
circuit, but provably cannot be solved by any classical circuit of
sub-logarithmic depth. We develop the discovery of Bravyi et al. into a
framework for robust randomness expansion. Our proposal does not rest on any
complexity-theoretic conjectures, but relies on the physical assumption that
the adversarial device being tested implements a circuit of sub-logarithmic
depth. Success on our task can be easily verified in classical linear time.
Finally, our task is more noise-tolerant than most other existing proposals
that can only tolerate multiplicative error, or require additional conjectures
from complexity theory; in contrast, we are able to allow a small constant
additive error in total variation distance between the sampled and ideal
distributions.