Using Quantum Computers for Quantum Simulation
Numerical simulation of quantum systems is crucial to further our
understanding of natural phenomena. Many systems of key interest and
importance, in areas such as superconducting materials and quantum chemistry,
are thought to be described by models which we cannot solve with sufficient
accuracy, neither analytically nor numerically with classical computers. Using
a quantum computer to simulate such quantum systems has been viewed as a key
application of quantum computation from the very beginning of the field in the
1980s. Moreover, useful results beyond the reach of classical computation are
expected to be accessible with fewer than a hundred qubits, making quantum
simulation potentially one of the earliest practical applications of quantum
computers. In this paper we survey the theoretical and experimental development
of quantum simulation using quantum computers, from the first ideas to the
intense research efforts currently underway.
Comment: 43 pages, 136 references, review article; v2: major revisions in response to referee comments; v3: significant revisions, identical to the published version apart from format; the arXiv version has a table of contents and references in alphabetical order.
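A central primitive in such digital quantum simulation is approximating the time-evolution operator e^{-iHt} by a product formula (Trotterization). The sketch below is a minimal classical illustration of the first-order formula; the two-spin Hamiltonian, step count, and all names are illustrative assumptions, not taken from the survey.

```python
# Minimal sketch: first-order Trotterization of U = exp(-i H t) for
# H = H1 + H2, checked classically with dense matrices. On a quantum
# computer each factor exp(-i H_k dt) would be a short gate sequence.
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

H1 = np.kron(Z, Z)                      # illustrative ZZ coupling
H2 = np.kron(X, I2) + np.kron(I2, X)    # illustrative transverse field
H = H1 + H2

t, n_steps = 1.0, 100
dt = t / n_steps
step = expm(-1j * H1 * dt) @ expm(-1j * H2 * dt)
U_trotter = np.linalg.matrix_power(step, n_steps)
U_exact = expm(-1j * H * t)

# Error shrinks as O(t^2 / n_steps) for the first-order formula.
print(np.linalg.norm(U_trotter - U_exact, 2))
```

Doubling n_steps roughly halves the error, which is the basic cost/accuracy trade-off in product-formula simulation.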
What is a quantum computer, and how do we build one?
The DiVincenzo criteria for implementing a quantum computer have been seminal
in focussing both experimental and theoretical research in quantum information
processing. These criteria were formulated specifically for the circuit model
of quantum computing. However, several new models (paradigms) for quantum computing have been proposed that do not seem to fit these criteria well. This raises the question of what the general criteria for implementing a quantum computer are. To this end, a formal operational definition of a quantum computer is introduced. It is then shown that, according to this definition, a device is a
quantum computer if it obeys the following four criteria: Any quantum computer
must (1) have a quantum memory; (2) facilitate a controlled quantum evolution
of the quantum memory; (3) include a method for cooling the quantum memory; and
(4) provide a readout mechanism for subsets of the quantum memory. The criteria
are met when the device is scalable and operates fault-tolerantly. We discuss
various existing quantum computing paradigms, and how they fit within this
framework. Finally, we lay out a roadmap for selecting an avenue towards
building a quantum computer. This is summarized in a decision tree intended to
help experimentalists determine the most natural paradigm given a particular
physical implementation.
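Purely as an illustration of the operational definition above (the field names and example values are assumptions, not the paper's decision tree), the four criteria plus the scalability and fault-tolerance requirements can be read as a checklist:

```python
# Hypothetical checklist encoding of the four criteria; illustrative only.
from dataclasses import dataclass

@dataclass
class CandidateDevice:
    quantum_memory: bool        # (1) has a quantum memory
    controlled_evolution: bool  # (2) controlled quantum evolution of the memory
    cooling_method: bool        # (3) a method for cooling the quantum memory
    subset_readout: bool        # (4) readout of subsets of the quantum memory
    scalable: bool              # the criteria must hold as the device scales
    fault_tolerant: bool        # ...and under fault-tolerant operation

def qualifies_as_quantum_computer(d: CandidateDevice) -> bool:
    return all([d.quantum_memory, d.controlled_evolution, d.cooling_method,
                d.subset_readout, d.scalable, d.fault_tolerant])

# e.g. a platform meeting the four criteria but not yet fault-tolerant:
print(qualifies_as_quantum_computer(
    CandidateDevice(True, True, True, True, True, False)))  # False
```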
Quantum Computing
Quantum mechanics---the theory describing the fundamental workings of
nature---is famously counterintuitive: it predicts that a particle can be in
two places at the same time, and that two remote particles can be inextricably
and instantaneously linked. These predictions have been the topic of intense
metaphysical debate ever since the theory's inception early last century.
However, supreme predictive power combined with direct experimental observation of some of these unusual phenomena leaves little doubt as to the theory's fundamental correctness. In fact, without quantum mechanics we could not explain the
workings of a laser, nor indeed how a fridge magnet operates. Over the last
several decades quantum information science has emerged to seek answers to the
question: can we gain some advantage by storing, transmitting and processing
information encoded in systems that exhibit these unique quantum properties?
Today it is understood that the answer is yes. Many research groups around the
world are working towards one of the most ambitious goals humankind has ever
embarked upon: a quantum computer that promises to exponentially improve
computational power for particular tasks. A number of physical systems,
spanning much of modern physics, are being developed for this task---ranging
from single particles of light to superconducting circuits---and it is not yet
clear which, if any, will ultimately prove successful. Here we describe the
latest developments for each of the leading approaches and explain what the
major challenges are for the future.
Comment: 26 pages, 7 figures, 291 references. Early draft of Nature 464, 45-53 (4 March 2010). The published version is more up to date and has several corrections, but is half the length with far fewer references.
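For concreteness, the two counterintuitive predictions mentioned above can be written down in a few lines of linear algebra; the toy numpy snippet below (our illustration, not the article's) shows a single-qubit superposition and the perfectly correlated outcomes of a Bell state.

```python
# Toy illustration of superposition and entanglement with state vectors.
import numpy as np

plus = np.array([1, 1]) / np.sqrt(2)        # equal superposition of |0> and |1>
print(np.abs(plus) ** 2)                    # [0.5 0.5]: both outcomes equally likely

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)  # amplitudes for |00>, |01>, |10>, |11>
print(np.abs(bell) ** 2)                    # only |00> and |11> occur, so the two
                                            # qubits' measurement results always agree
```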
An a posteriori verification method for generalized real-symmetric eigenvalue problems in large-scale electronic state calculations
An a posteriori verification method is proposed for the generalized
real-symmetric eigenvalue problem and is applied to densely clustered
eigenvalue problems in large-scale electronic state calculations. The proposed
method is realized by a two-stage process in which the approximate solution is
computed by existing numerical libraries and is then verified in a moderate
computational time. The procedure returns intervals, each of which contains one exact eigenvalue. Test calculations were carried out for organic
device materials, and the verification method confirms that all exact
eigenvalues are well separated in the obtained intervals. This verification
method will be integrated into EigenKernel (https://github.com/eigenkernel/),
which is middleware for various parallel solvers for the generalized eigenvalue
problem. Such an a posteriori verification method will be important in future
computational science.
Comment: 15 pages, 7 figures.
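To make the two-stage idea concrete, here is a rough sketch (our illustration with assumed matrices, not the EigenKernel procedure): compute approximate eigenpairs with a standard library routine, then attach to each approximate eigenvalue an interval radius from the classical symmetric residual bound. In floating point this bound is only heuristic; the paper's method instead certifies the intervals rigorously with verified numerics.

```python
# Sketch: a posteriori residual intervals for A x = lambda B x (A symmetric,
# B s.p.d.), via reduction to the standard symmetric problem M = L^-1 A L^-T.
# For symmetric M and a unit vector y, some exact eigenvalue of M lies within
# ||M y - theta y||_2 of theta (exact-arithmetic bound; heuristic in floats).
import numpy as np
from scipy.linalg import eigh, cholesky, solve_triangular

rng = np.random.default_rng(0)
n = 200
A = rng.standard_normal((n, n)); A = (A + A.T) / 2             # symmetric
C = rng.standard_normal((n, n)); B = C @ C.T + n * np.eye(n)   # s.p.d.

theta, Xv = eigh(A, B)                 # stage 1: approximate solution

L = cholesky(B, lower=True)            # stage 2: verification
M = solve_triangular(L, solve_triangular(L, A, lower=True).T, lower=True)
M = (M + M.T) / 2                      # re-symmetrize against round-off

for i in (0, n // 2, n - 1):           # a few sample intervals
    y = L.T @ Xv[:, i]
    y /= np.linalg.norm(y)
    r = np.linalg.norm(M @ y - theta[i] * y)
    print(f"eigenvalue {i}: [{theta[i] - r:.6e}, {theta[i] + r:.6e}]")
```

When neighboring intervals do not overlap, the approximate eigenvalues are confirmed to be well separated, which is exactly the property checked in the paper's test calculations.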
Roadmap on electronic structure codes in the exascale era
Electronic structure calculations have been instrumental in providing many important insights into a range of physical and chemical properties of various molecular and solid-state systems. Their importance to various fields, including materials science, chemical sciences, computational chemistry, and device physics, is underscored by the large fraction of available public supercomputing resources devoted to these calculations. As we enter the exascale era, exciting new opportunities to increase simulation numbers, sizes, and accuracies present themselves. In order to realize these promises, however, the community of electronic structure software developers will first have to tackle a number of challenges pertaining to the efficient use of new architectures that will rely heavily on massive parallelism and hardware accelerators. This roadmap provides a broad overview of the state of the art in electronic structure calculations and of the various new directions being pursued by the community. It covers 14 electronic structure codes, presenting their current status, their development priorities over the next five years, and their plans for tackling the challenges and leveraging the opportunities presented by the advent of exascale computing.