On the role of entanglement in quantum computational speed-up
For any quantum algorithm operating on pure states we prove that the presence
of multi-partite entanglement, with a number of parties that increases
unboundedly with input size, is necessary if the quantum algorithm is to offer
an exponential speed-up over classical computation. Furthermore we prove that
the algorithm can be classically efficiently simulated to within a prescribed
tolerance η even if a suitably small amount of global entanglement
(depending on η) is present. We explicitly identify the occurrence of
increasing multi-partite entanglement in Shor's algorithm. Our results do not
apply to quantum algorithms operating on mixed states in general and we discuss
the suggestion that an exponential computational speed-up might be possible
with mixed states in the total absence of entanglement. Finally, despite the
essential role of entanglement for pure state algorithms, we argue that it is
nevertheless misleading to view entanglement as a key resource for quantum
computational power.
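To make the bounded-entanglement condition concrete, here is a minimal numeric check, my own construction rather than anything from the paper: a block of qubits is unentangled from the rest exactly when its reduced density matrix is pure, so computing Tr(rho^2) across cuts diagnoses where bipartite entanglement sits. The test states and function name are illustrative.

```python
import numpy as np

def leading_block_purity(psi, n, k):
    """Purity Tr(rho^2) of the reduced state of the first k qubits."""
    M = psi.reshape(2**k, 2**(n - k))
    rho = M @ M.conj().T
    return float(np.real(np.trace(rho @ rho)))

n = 4
product = np.zeros(2**n); product[0] = 1.0                  # |0000>
ghz = np.zeros(2**n); ghz[0] = ghz[-1] = 1 / np.sqrt(2)     # (|0000> + |1111>)/sqrt(2)

for name, psi in [("product", product), ("GHZ", ghz)]:
    purities = [leading_block_purity(psi, n, k) for k in range(1, n)]
    print(name, ["%.2f" % p for p in purities])  # 1.00 at a cut => no entanglement there
```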
Adiabatic quantum computation and quantum phase transitions
We analyze the ground state entanglement in a quantum adiabatic evolution
algorithm designed to solve the NP-complete Exact Cover problem. The entropy of
entanglement seems to obey linear and universal scaling at the point where the
mass gap becomes small, suggesting that the system passes near a quantum phase
transition. Such rapid growth of entanglement suggests that the effective
connectivity of the system diverges as the number of qubits goes to infinity
and that this algorithm cannot be efficiently simulated by classical means. On
the other hand, entanglement in Grover's algorithm is bounded by a constant.
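The following is a small exact-diagonalization sketch of the kind of computation behind this claim, not the authors' code: it interpolates H(s) = (1-s)*H_B + s*H_P for a hypothetical six-qubit Exact Cover instance and tracks the spectral gap together with the half-chain entanglement entropy of the instantaneous ground state. The instance and the grid of s values are my choices.

```python
import numpy as np

n = 6  # qubits; exact diagonalization limits us to small sizes
sx = np.array([[0.0, 1.0], [1.0, 0.0]])

def x_on(q):
    """sigma_x on qubit q (qubit 0 = leftmost/most significant factor)."""
    out = np.array([[1.0]])
    for i in range(n):
        out = np.kron(out, sx if i == q else np.eye(2))
    return out

# Driver Hamiltonian: H_B = sum_i (1 - X_i)/2; its ground state is the
# uniform superposition over all bit strings.
H_B = sum(0.5 * (np.eye(2**n) - x_on(q)) for q in range(n))

# Problem Hamiltonian: one penalty per Exact Cover clause (i, j, k),
# energy 0 iff exactly one of the three bits is 1. Hypothetical instance,
# chosen so the satisfying assignment is unique.
clauses = [(0, 1, 2), (1, 2, 3), (2, 3, 4), (3, 4, 5), (0, 1, 3)]
diag = np.zeros(2**n)
for z in range(2**n):
    bits = [(z >> (n - 1 - q)) & 1 for q in range(n)]  # big-endian, matching x_on
    diag[z] = sum((bits[i] + bits[j] + bits[k] - 1)**2 for i, j, k in clauses)
H_P = np.diag(diag)

def half_entropy(psi):
    """Entanglement entropy (bits) of the first n//2 qubits."""
    s = np.linalg.svd(psi.reshape(2**(n // 2), -1), compute_uv=False)
    p = s**2
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

for s_val in np.linspace(0.0, 1.0, 11):
    H = (1 - s_val) * H_B + s_val * H_P
    vals, vecs = np.linalg.eigh(H)
    print(f"s={s_val:.1f}  gap={vals[1] - vals[0]:.3f}  S={half_entropy(vecs[:, 0]):.3f}")
```

The entropy is zero at both endpoints (product ground states) and peaks in between, where the gap is smallest, which is the qualitative behaviour the abstract describes.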
On the Necessity of Entanglement for the Explanation of Quantum Speedup
In this paper I argue that entanglement is a necessary component for any
explanation of quantum speedup and I address some purported counter-examples
that some claim show that the contrary is true. In particular, I address Biham
et al.'s mixed-state version of the Deutsch-Jozsa algorithm, and Knill &
Laflamme's deterministic quantum computation with one qubit (DQC1) model of
quantum computation. I argue that these examples do not demonstrate that
entanglement is unnecessary for the explanation of quantum speedup, but that
they rather illuminate and clarify the role that entanglement does play.
Reflections on the Role of Entanglement in the Explanation of Quantum Computational Speedup
Of the many and varied applications of quantum information theory, perhaps the most fascinating is the sub-field of quantum computation. In this sub-field, computational algorithms are designed which utilise the resources available in quantum systems in order to compute solutions to computational problems with, in some cases, exponentially fewer resources than any known classical algorithm. While the fact of quantum computational speedup is almost beyond doubt, the source of quantum speedup is still a matter of debate. In this paper I argue that entanglement is a necessary component for any explanation of quantum speedup and I address some purported counter-examples that some claim show that the contrary is true. In particular, I address Biham et al.'s mixed-state version of the Deutsch-Jozsa algorithm, and Knill & Laflamme's deterministic quantum computation with one qubit (DQC1) model of quantum computation. I argue that these examples do not demonstrate that entanglement is unnecessary for the explanation of quantum speedup, but that they rather illuminate and clarify the role that entanglement does play.
Universal quantum computation with little entanglement
We show that universal quantum computation can be achieved in the standard
pure-state circuit model while, at any time, the entanglement entropy of all
bipartitions is small, even tending to zero with growing system size. The
result is obtained by showing that a quantum computer operating within a small
region around the set of unentangled states still has universal computational
power, and by using continuity of entanglement entropy. In fact, an analogous
conclusion applies to every entanglement measure that is continuous in a
certain natural sense, and this is a large class. Other examples include
the geometric measure, localizable entanglement, smooth epsilon-measures,
multipartite concurrence, squashed entanglement, and several others. We discuss
implications of these results for the believed role of entanglement as a key
necessary resource for quantum speed-ups.
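A quick numeric illustration of the continuity point, using an example of my own rather than the paper's construction: tilting a product state slightly toward a GHZ state yields small entanglement entropy across every bipartition, and the entropy vanishes with the tilt.

```python
import numpy as np

n = 8

def entropy_across_cut(psi, k):
    """Von Neumann entropy (bits) of the first k qubits."""
    s = np.linalg.svd(psi.reshape(2**k, -1), compute_uv=False)
    p = s**2
    p = p[p > 1e-15]
    return float(-(p * np.log2(p)).sum())

zeros = np.zeros(2**n); zeros[0] = 1.0      # |0...0>
ones = np.zeros(2**n); ones[-1] = 1.0       # |1...1>

for eps in [0.5, 0.1, 0.01, 0.001]:
    # A state eps-close to the product state |0...0>, tilted toward GHZ.
    psi = np.sqrt(1 - eps**2) * zeros + eps * ones
    worst = max(entropy_across_cut(psi, k) for k in range(1, n))
    print(f"eps={eps:<6}  max bipartite entropy = {worst:.5f} bits")
```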
Entanglement and its Role in Shor's Algorithm
Entanglement has been termed a critical resource for quantum information
processing and is thought to be the reason that certain quantum algorithms,
such as Shor's factoring algorithm, can achieve exponentially better
performance than their classical counterparts. The nature of this resource is
still not fully understood: here we use numerical simulation to investigate how
entanglement between register qubits varies as Shor's algorithm is run on a
quantum computer. The shifting patterns in the entanglement are found to relate
to the choice of basis for the quantum Fourier transform.
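As a companion to this kind of simulation, here is a minimal numpy sketch (not the authors' code) of one standard fact about Shor's algorithm: after modular exponentiation, the entanglement between the argument and function registers is log2(r) bits, where r is the period, provided r divides the register dimension. For the toy instance N = 15, a = 7 the period is 4, giving 2 bits; the register size t is my choice.

```python
import numpy as np

N, a = 15, 7          # toy instance: the order of 7 mod 15 is r = 4
t = 8                 # argument register qubits, Q = 2**t = 256
Q = 2**t

# State after modular exponentiation: (1/sqrt(Q)) * sum_x |x>|a^x mod N>,
# stored as a Q x N matrix of amplitudes psi[x, f].
psi = np.zeros((Q, N))
for x in range(Q):
    psi[x, pow(a, x, N)] = 1.0
psi /= np.sqrt(Q)

# Entanglement between the two registers = entropy of the squared Schmidt
# coefficients, read off from an SVD of the amplitude matrix.
s = np.linalg.svd(psi, compute_uv=False)
p = s**2
p = p[p > 1e-12]
S = -(p * np.log2(p)).sum()
print(f"inter-register entanglement entropy = {S:.3f} bits")  # log2(4) = 2
```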
A Computational Model for Quantum Measurement
Is the dynamical evolution of physical systems objectively a manifestation of
information processing by the universe? We find that an affirmative answer has
important consequences for the measurement problem. In particular, we calculate
the amount of quantum information processing involved in the evolution of
physical systems, assuming a finite degree of fine-graining of Hilbert space.
This assumption is shown to imply that there is a finite capacity to sustain
the immense entanglement that measurement entails. When this capacity is
overwhelmed, the system's unitary evolution becomes computationally unstable
and the system suffers an information transition ('collapse'). Classical
behaviour arises from the rapid cycles of unitary evolution and information
transitions.
Thus, the fine-graining of Hilbert space determines the location of the
'Heisenberg cut', the mesoscopic threshold separating the microscopic, quantum
system from the macroscopic, classical environment. The model can be viewed as
a probabilistic complement to decoherence that completes the measurement
process by turning decohered improper mixtures of states into proper mixtures.
It is shown to provide a natural resolution to the measurement problem and the
basis problem.
Efficient classical simulation of slightly entangled quantum computations
We present a scheme to efficiently simulate, with a classical computer, the
dynamics of multipartite quantum systems on which the amount of entanglement
(or of correlations in the case of mixed-state dynamics) is conveniently
restricted. The evolution of a pure state of n qubits can be simulated by using
computational resources that grow linearly in n and exponentially in the
entanglement. We show that a pure-state quantum computation can only yield an
exponential speed-up with respect to classical computations if the entanglement
increases with the size n of the computation, and we give a lower bound on the
required growth.
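To make the claimed scaling tangible, here is a bare-bones sketch of the underlying idea, a generic matrix-product decomposition rather than the paper's specific scheme: repeated SVDs split an n-qubit state into local tensors with bond dimension capped at chi, so memory grows roughly as n * chi^2, and a slightly entangled state such as GHZ fits exactly at chi = 2. Function names, tolerances, and the test state are mine.

```python
import numpy as np

def to_mps(psi, n, chi_max):
    """Split an n-qubit statevector into MPS tensors, truncating to chi_max."""
    tensors, err = [], 0.0
    M = psi.reshape(1, -1)
    for site in range(n - 1):
        chi_l = M.shape[0]
        M = M.reshape(chi_l * 2, -1)
        U, s, Vh = np.linalg.svd(M, full_matrices=False)
        keep = min(chi_max, int(np.sum(s > 1e-12)))
        err += float((s[keep:]**2).sum())           # discarded Schmidt weight
        tensors.append(U[:, :keep].reshape(chi_l, 2, keep))
        M = s[:keep, None] * Vh[:keep]              # carry the remainder onward
    tensors.append(M.reshape(M.shape[0], 2, 1))
    return tensors, err

# Test: a GHZ state is "slightly entangled" in this sense -- Schmidt rank 2
# at every contiguous cut -- so chi_max = 2 reproduces it exactly for any n.
n = 10
ghz = np.zeros(2**n); ghz[0] = ghz[-1] = 1 / np.sqrt(2)
_, err = to_mps(ghz, n, chi_max=2)
print(f"truncation error at chi=2: {err:.2e}")      # ~0: exact representation
```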