Data is often loadable in short depth: Quantum circuits from tensor networks for finance, images, fluids, and proteins
Though there has been substantial progress in developing quantum algorithms
to study classical datasets, the cost of simply loading classical data is an
obstacle to quantum advantage. When amplitude encoding is used, loading an
arbitrary classical vector requires circuit depths that scale, in the worst
case, exponentially with the number of qubits. Here, we address this "input
problem" with
two contributions. First, we introduce a circuit compilation method based on
tensor network (TN) theory. Our method -- AMLET (Automatic Multi-layer Loader
Exploiting TNs) -- proceeds via careful construction of a specific TN topology
and can be tailored to arbitrary circuit depths. Second, we perform numerical
experiments on real-world classical data from four distinct areas: finance,
images, fluid mechanics, and proteins. To the best of our knowledge, this is
the broadest numerical analysis to date of loading classical data into a
quantum computer. Consistent with other recent work in this area, the required
circuit depths are often several orders of magnitude shorter than those of the
exponentially scaling general loading algorithm. Besides introducing a more
efficient loading algorithm, this work demonstrates that many classical
datasets are loadable in depths much shorter than previously expected, which
has positive implications for speeding up classical workloads on quantum
computers.
Comment: 10 pages, 3 figures
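To make the input problem concrete, the following minimal sketch (Python with Qiskit, using a hypothetical 8-element data vector) amplitude-encodes a classical vector via generic exact state preparation, whose compiled depth grows exponentially with qubit count. This illustrates the bottleneck the abstract describes, not the AMLET method itself.

```python
import numpy as np
from qiskit import QuantumCircuit, transpile

# Hypothetical classical data vector with 2^n entries (here n = 3 qubits).
data = np.array([0.1, 0.4, 0.2, 0.7, 0.3, 0.9, 0.5, 0.6])

# Amplitude encoding requires a unit-norm state: |psi> = sum_i (x_i/||x||)|i>.
amplitudes = data / np.linalg.norm(data)
n_qubits = int(np.log2(len(amplitudes)))

# Generic (exact) state preparation; its depth grows exponentially in n_qubits.
qc = QuantumCircuit(n_qubits)
qc.initialize(amplitudes, range(n_qubits))

# Compile to a two-qubit + single-qubit basis and inspect the resulting depth.
compiled = transpile(qc, basis_gates=["cx", "u"])
print(compiled.depth())
```

Rerunning this sketch with larger vectors shows the depth blow-up that tensor-network compilation methods aim to avoid for structured, real-world data.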
HamLib: A library of Hamiltonians for benchmarking quantum algorithms and hardware
In order to characterize and benchmark computational hardware, software, and
algorithms, it is essential to have many problem instances on-hand. This is no
less true for quantum computation, where a large collection of real-world
problem instances would allow for benchmarking studies that in turn help to
improve both algorithms and hardware designs. To this end, here we present a
large dataset of qubit-based quantum Hamiltonians. The dataset, called HamLib
(for Hamiltonian Library), is freely available online and contains problem
sizes ranging from 2 to 1000 qubits. HamLib includes problem instances of the
Heisenberg model, Fermi-Hubbard model, Bose-Hubbard model, molecular electronic
structure, molecular vibrational structure, MaxCut, Max-k-SAT, Max-k-Cut,
QMaxCut, and the traveling salesperson problem. The goals of this effort are
(a) to save researchers time by eliminating the need to prepare problem
instances and map them to qubit representations, (b) to allow for more thorough
tests of new algorithms and hardware, and (c) to allow for reproducibility and
standardization across research studies.
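As a flavor of the problem classes HamLib catalogs, the sketch below builds a small open-chain Heisenberg Hamiltonian as a qubit operator. This is a minimal illustration using Qiskit's SparsePauliOp, under assumed conventions (uniform coupling, open boundaries); it is not HamLib's own file format or API.

```python
from qiskit.quantum_info import SparsePauliOp

def heisenberg_chain(n_qubits: int, coupling: float = 1.0) -> SparsePauliOp:
    """H = J * sum_i (X_i X_{i+1} + Y_i Y_{i+1} + Z_i Z_{i+1}) on an open chain."""
    terms = []
    for i in range(n_qubits - 1):
        for pauli in "XYZ":
            # Pauli string acting on neighboring sites i and i+1.
            label = "I" * i + pauli * 2 + "I" * (n_qubits - i - 2)
            terms.append((label, coupling))
    return SparsePauliOp.from_list(terms)

H = heisenberg_chain(4)
print(H)  # 9 two-qubit Pauli terms for a 4-site chain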
Encoding trade-offs and design toolkits in quantum algorithms for discrete optimization: coloring, routing, scheduling, and other problems
Challenging combinatorial optimization problems are ubiquitous in science and
engineering. Several quantum methods for optimization have recently been
developed, in different settings including both exact and approximate solvers.
This manuscript addresses this field of research with three distinct purposes.
First, we present an intuitive method for synthesizing and analyzing discrete
(i.e., integer-based) optimization problems, wherein the problem and
corresponding algorithmic primitives are expressed using a discrete quantum
intermediate representation (DQIR) that is encoding-independent. This compact
representation often allows for more efficient problem compilation, automated
analyses of different encoding choices, easier interpretability, more complex
runtime procedures, and richer programmability, as compared to previous
approaches, which we demonstrate with a number of examples. Second, we perform
numerical studies comparing several qubit encodings; the results exhibit a
number of preliminary trends that help guide the choice of encoding for a
particular set of hardware and a particular problem and algorithm. Our study
includes problems related to graph coloring, the traveling salesperson problem,
factory/machine scheduling, financial portfolio rebalancing, and integer linear
programming. Third, we design low-depth graph-derived partial mixers (GDPMs)
for quantum variables of up to 16 levels, demonstrating that compact (binary)
encodings are more amenable to QAOA than previously understood. We expect this
toolkit of programming abstractions and low-level building blocks to aid in
designing quantum algorithms for discrete combinatorial problems.
Comment: 46 pages; 11 figures
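The qubit-count trade-off among common encodings of a d-level discrete variable can be seen in a few lines. The sketch below is an illustrative simplification under standard conventions (binary, one-hot, and domain-wall encodings), not code from the paper or its DQIR toolkit.

```python
import math

def qubits_required(d: int) -> dict:
    """Qubits needed to represent one d-level discrete variable."""
    return {
        "binary": math.ceil(math.log2(d)),  # compact: ceil(log2 d) qubits
        "one_hot": d,                       # one qubit per level
        "domain_wall": d - 1,               # unary domain-wall: d - 1 qubits
    }

for d in (3, 8, 16):
    print(d, qubits_required(d))
# e.g. a 16-level variable: 4 qubits (binary) vs 16 (one-hot) vs 15 (domain wall)
```

The compact binary encoding uses exponentially fewer qubits per variable, which is why the finding that it is amenable to QAOA mixers is significant.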
HamLib: A Library of Hamiltonians for Benchmarking Quantum Algorithms and Hardware
For a considerable time, large datasets containing problem instances have proven valuable for analyzing computer hardware, software, and algorithms. One notable example of the value of large datasets is ImageNet [1], a vast repository of images that has been instrumental in testing numerous deep learning packages. Similarly, in the domain of computational chemistry and materials science, the availability of extensive datasets such as the Protein Data Bank [2], the Materials Project [3], and QM9 [4] has greatly facilitated the evaluation of new algorithms and software approaches, while also promoting standardization within the field. These well-defined datasets and problem instances, in turn, serve as the foundation for creating benchmarking suites like MLPerf [5] and LINPACK [6], [7]. These suites enable fair and rigorous comparisons of different methodologies and solutions, fostering continuous advancements in various areas of computer science and beyond.