Preparing thermal states of quantum systems by dimension reduction
We present an algorithm that prepares thermal Gibbs states of one dimensional
quantum systems on a quantum computer without any memory overhead, and in a
time significantly shorter than other known alternatives. Specifically, the
time complexity is dominated by the quantity N^{‖h‖/T}, where N is the
size of the system, ‖h‖ is a bound on the operator norm of the local terms
of the Hamiltonian (coupling energy), and T is the temperature. Given other
results on the complexity of thermalization, this overall scaling is likely
optimal. For higher dimensions, our algorithm lowers the known scaling of the
time complexity with the dimension of the system by one.
Comment: Published version. Minor editorial changes, one new reference added. 4 pages, 1 figure.
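The object the algorithm prepares is the Gibbs state ρ = e^{-H/T}/Z. As a point of reference only (this is not the paper's quantum algorithm), a minimal classical sketch builds that state by exact diagonalization for a tiny 1D chain; the model and parameters (transverse-field Ising, J = 1, g = 0.5, N = 3) are illustrative choices, and the quantum algorithm's point is precisely to avoid this 2^N-dimensional computation:

```python
import numpy as np

# Illustrative target state only, NOT the quantum algorithm: the Gibbs state
# rho = exp(-H/T)/Z of a small 1D transverse-field Ising chain, built by
# exact diagonalization. Model and parameters are hypothetical examples.

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

def kron_all(ops):
    """Tensor product of a list of single-qubit operators."""
    out = ops[0]
    for op in ops[1:]:
        out = np.kron(out, op)
    return out

def ising_hamiltonian(n, J=1.0, g=0.5):
    """H = -J sum_i Z_i Z_{i+1} - g sum_i X_i on an open chain of n qubits."""
    H = np.zeros((2**n, 2**n))
    for i in range(n - 1):
        ops = [I2] * n; ops[i] = Z; ops[i + 1] = Z
        H -= J * kron_all(ops)
    for i in range(n):
        ops = [I2] * n; ops[i] = X
        H -= g * kron_all(ops)
    return H

def gibbs_state(H, T):
    """rho = exp(-H/T)/Z via eigendecomposition (shifted for stability)."""
    evals, evecs = np.linalg.eigh(H)
    w = np.exp(-(evals - evals.min()) / T)
    w /= w.sum()
    return (evecs * w) @ evecs.conj().T

H = ising_hamiltonian(3)
rho = gibbs_state(H, T=0.5)
print(np.trace(rho).real)  # a valid density matrix has unit trace
```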
Fast Quantum Methods for Optimization
Discrete combinatorial optimization consists in finding the optimal
configuration that minimizes a given discrete objective function. An
interpretation of such a function as the energy of a classical system allows us
to reduce the optimization problem into the preparation of a low-temperature
thermal state of the system. Motivated by the quantum annealing method, we
present three strategies to prepare the low-temperature state that exploit
quantum mechanics in remarkable ways. We focus on implementations without
uncontrolled errors induced by the environment. This allows us to rigorously
prove a quantum advantage. The first strategy uses a classical-to-quantum
mapping, where the equilibrium properties of a classical system in spatial
dimensions can be determined from the ground state properties of a quantum
system also in spatial dimensions. We show how such a ground state can be
prepared by means of quantum annealing, including quantum adiabatic evolutions.
This mapping also allows us to unveil some fundamental relations between
simulated and quantum annealing. The second strategy builds upon the first one
and introduces a technique called spectral gap amplification to reduce the time
required to prepare the same quantum state adiabatically. If implemented on a
quantum device that exploits quantum coherence, this strategy leads to a
quadratic improvement in complexity over the well-known bound of the classical
simulated annealing method. The third strategy is not purely adiabatic;
instead, it exploits diabatic processes between the low-energy states of the
corresponding quantum system. For some problems it results in an exponential
speedup (in the oracle model) over the best classical algorithms.
Comment: 15 pages, 2 figures.
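The classical baseline against which the quadratic improvement is stated is simulated annealing: Metropolis sampling under a slowly decreasing temperature. A minimal sketch on a hypothetical instance (a ferromagnetic 1D Ising chain with an illustrative geometric cooling schedule) shows the method the quantum strategies improve upon:

```python
import math
import random

# Sketch of the CLASSICAL baseline (simulated annealing) on an illustrative
# problem: minimize the ferromagnetic 1D Ising energy. Chain length, step
# count, and schedule are hypothetical choices, not taken from the paper.

def energy(spins):
    """Open-chain Ising energy E = -sum_i s_i * s_{i+1}."""
    return -sum(s * t for s, t in zip(spins, spins[1:]))

def simulated_annealing(n=20, steps=5000, t_start=2.0, t_end=0.01, seed=1):
    """Metropolis updates while geometrically lowering the temperature."""
    rng = random.Random(seed)
    spins = [rng.choice([-1, 1]) for _ in range(n)]
    e = energy(spins)
    for k in range(steps):
        temp = t_start * (t_end / t_start) ** (k / steps)
        i = rng.randrange(n)
        spins[i] *= -1                      # propose a single spin flip
        e_new = energy(spins)
        if e_new <= e or rng.random() < math.exp(-(e_new - e) / temp):
            e = e_new                       # accept the move
        else:
            spins[i] *= -1                  # reject: undo the flip
    return spins, e

spins, e = simulated_annealing()
print(e)  # the global minimum (all spins aligned) is -(n - 1) = -19
```

The bound the abstract refers to controls how slowly the temperature must fall for this process to reach the ground state; the second quantum strategy achieves a quadratically better dependence on the relevant gap.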
Entanglement and complexity of interacting qubits subject to asymmetric noise
The simulation complexity of predicting the time evolution of delocalized
many-body quantum systems has attracted much recent interest, and simulations
of such systems in real quantum hardware are promising routes to demonstrating
a quantum advantage over classical machines. In these proposals, random noise
is an obstacle that must be overcome for a faithful simulation, and a single
error event can be enough to drive the system to a classically trivial state.
We argue that this need not always be the case, and consider a modification to
a leading quantum sampling problem -- time evolution in an interacting
Bose-Hubbard chain of transmon qubits [Neill et al., Science 2018] -- where each
site in the chain has a driven coupling to a lossy resonator and particle
number is no longer conserved. The resulting quantum dynamics are complex and
highly nontrivial. We argue that this problem is harder to simulate than the
isolated chain, and that it can achieve volume-law entanglement even in the
strong noise limit, likely persisting up to system sizes beyond the scope of
classical simulation. Further, we show that the metrics which suggest classical
intractability for the isolated chain point to similar conclusions in the noisy
case. These results suggest that quantum sampling problems including nontrivial
noise could be good candidates for demonstrating a quantum advantage in
near-term hardware.
Comment: 20 pages, 15 figures.
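The volume-law diagnostic mentioned above is the von Neumann entropy of half the chain. A small self-contained sketch (a Haar-random state standing in for the chaotic dynamics, not the paper's Bose-Hubbard model) shows how that entropy is computed and that generic states sit near the maximal value:

```python
import numpy as np

# Illustrative diagnostic, not the paper's model: half-chain von Neumann
# entropy of a Haar-random n-qubit state, computed from the Schmidt spectrum.
# Random states show near-maximal ("volume-law") entanglement.

def half_chain_entropy(psi, n):
    """Von Neumann entropy (in bits) of the first n//2 qubits of |psi>."""
    d_a = 2 ** (n // 2)
    d_b = 2 ** (n - n // 2)
    m = psi.reshape(d_a, d_b)            # bipartition into two halves
    s = np.linalg.svd(m, compute_uv=False)
    p = s**2                              # Schmidt probabilities
    p = p[p > 1e-15]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
n = 10
psi = rng.normal(size=2**n) + 1j * rng.normal(size=2**n)
psi /= np.linalg.norm(psi)               # Haar-random pure state

S = half_chain_entropy(psi, n)
# Near-maximal: the Haar (Page) average for a 5|5 cut is
# 5 - 1/(2 ln 2) ~ 4.28 bits, versus the maximum of 5 bits.
print(S)
```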
Eigenstate preparation by phase decoherence
A computation in adiabatic quantum computing is implemented by traversing a path of nondegenerate eigenstates of a continuous family of Hamiltonians. We introduce a method that traverses a discretized form of the path: at each step we apply the instantaneous Hamiltonian for a random time. The resulting decoherence approximates a projective measurement onto the desired eigenstate, achieving a version of the quantum Zeno effect. The average cost of our method is O(L^2/Δ) for constant error probability, where L is the length of the path of eigenstates and Δ is the minimum spectral gap of the Hamiltonian. For many cases of interest, L does not depend on Δ, so the scaling of the cost with the gap is better than the one obtained in rigorous proofs of the adiabatic theorem. We give an example where this situation occurs.
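The dephasing step can be seen in a minimal numerical sketch (an assumed generic two-level Hamiltonian, not an example from the paper): evolving under the instantaneous Hamiltonian for a random time and averaging over that randomness suppresses coherences between eigenstates, which is how the method approximates a projective measurement in the eigenbasis:

```python
import numpy as np

# Sketch of phase decoherence on an illustrative two-level Hamiltonian:
# averaging e^{-iHt} rho e^{+iHt} over random times t kills off-diagonal
# terms in the eigenbasis while leaving eigenstate populations untouched.

H = np.array([[0.0, 0.3], [0.3, 1.0]])          # hypothetical 2-level H
evals, evecs = np.linalg.eigh(H)
gap = evals[1] - evals[0]

psi = (evecs[:, 0] + evecs[:, 1]) / np.sqrt(2)  # equal superposition
rho = np.outer(psi, psi.conj())

rng = np.random.default_rng(0)
avg = np.zeros_like(rho, dtype=complex)
n_samples = 2000
for _ in range(n_samples):
    t = rng.uniform(0, 20 / gap)                # random time, scale ~ 1/gap
    u = evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T
    avg += u @ rho @ u.conj().T / n_samples

# Express the averaged state in the eigenbasis of H:
avg_eig = evecs.conj().T @ avg @ evecs
print(abs(avg_eig[0, 1]))                       # far below the initial 0.5
print(avg_eig[0, 0].real, avg_eig[1, 1].real)   # populations remain 1/2, 1/2
```

Repeating this along the discretized path keeps the state pinned to the traversed eigenstate, the Zeno effect the abstract describes.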
Boundaries of quantum supremacy via random circuit sampling
Google's recent quantum supremacy experiment heralded a transition point
where quantum computing performed a computational task, random circuit
sampling, that is beyond the practical reach of modern supercomputers. We
examine the constraints of the observed quantum runtime advantage in an
analytical extrapolation to circuits with a larger number of qubits and gates.
Due to the exponential decrease of the experimental fidelity with the number of
qubits and gates, we demonstrate for current fidelities a theoretical classical
runtime advantage for circuits beyond a depth of 100, while quantum runtimes
for cross-entropy benchmarking limit the region of a quantum advantage to
around 300 qubits. However, the quantum runtime advantage boundary grows
exponentially with reduced error rates, and our work highlights the importance
of continued progress along this line. Extrapolations of measured error rates
suggest that the limiting circuit size for which a computationally feasible
quantum runtime advantage in cross-entropy benchmarking can be achieved
approximately coincides with expectations for early implementations of the
surface code and other quantum error correction methods. Thus the boundaries of
quantum supremacy via random circuit sampling may fortuitously coincide with
the advent of scalable, error corrected quantum computing in the near term.
Comment: 8 pages, 3 figures.
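The cross-entropy benchmark underlying these runtime comparisons can be sketched in a few lines. Assuming the standard linear XEB estimator F = 2^n ⟨p_ideal(x)⟩ - 1 and using a Haar-random state as a stand-in for a random circuit's output (illustrative sizes, not the experiment's), samples from the ideal distribution score near 1 while uniform noise scores near 0:

```python
import numpy as np

# Sketch of the linear cross-entropy benchmark (XEB): F = 2^n * <p_ideal> - 1
# over observed samples. A Haar-random state stands in for a random circuit;
# qubit count and sample sizes are illustrative choices.

rng = np.random.default_rng(1)
n = 12
dim = 2**n
amps = rng.normal(size=dim) + 1j * rng.normal(size=dim)
p_ideal = np.abs(amps)**2
p_ideal /= p_ideal.sum()                          # Porter-Thomas-like spectrum

good = rng.choice(dim, size=50_000, p=p_ideal)    # "ideal device" samples
noise = rng.integers(0, dim, size=50_000)         # fully depolarized samples

def xeb(samples):
    """Linear XEB fidelity estimator."""
    return dim * p_ideal[samples].mean() - 1

print(xeb(good), xeb(noise))  # first value near 1, second near 0
```

The exponential fidelity decay discussed above shows up here as the measured F shrinking toward 0 as noise grows with qubit and gate count.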