Investigating Topological Order using Recurrent Neural Networks
Recurrent neural networks (RNNs), originally developed for natural language
processing, hold great promise for accurately describing strongly correlated
quantum many-body systems. Here, we employ 2D RNNs to investigate two
prototypical quantum many-body Hamiltonians exhibiting topological order.
Specifically, we demonstrate that RNN wave functions can effectively capture
the topological order of the toric code and a Bose-Hubbard spin liquid on the
kagome lattice by estimating their topological entanglement entropies. We also
find that RNNs favor coherent superpositions of minimally-entangled states over
minimally-entangled states themselves. Overall, our findings demonstrate that
RNN wave functions constitute a powerful tool to study phases of matter beyond
Landau's symmetry-breaking paradigm.
Comment: 14 pages, 7 figures, 1 table. A version with new correction
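The topological entanglement entropy mentioned above can be isolated by combining subregion entropies so that the boundary (area-law) contributions cancel; one standard construction is the Kitaev-Preskill combination. A minimal sketch, with illustrative region boundary lengths not taken from the paper:

```python
import numpy as np

def topological_entanglement_entropy(S):
    """Kitaev-Preskill combination of subregion entanglement entropies.

    A, B, C are three adjoining regions; boundary (area-law) terms cancel
    in this combination, leaving -gamma, the topological entanglement
    entropy (gamma = ln 2 for the toric code).
    """
    return (S["A"] + S["B"] + S["C"]
            - S["AB"] - S["BC"] - S["AC"]
            + S["ABC"])

# Toy check with area-law entropies S = alpha*boundary - gamma: the
# boundary lengths below are chosen so the area terms cancel exactly.
gamma, alpha = np.log(2), 0.5
boundary = {"A": 4, "B": 4, "C": 4, "AB": 6, "BC": 6, "AC": 6, "ABC": 6}
S = {k: alpha * b - gamma for k, b in boundary.items()}
print(topological_entanglement_entropy(S))   # -> -ln 2, about -0.693
```

In practice the subregion entropies are themselves estimated from the RNN wave function (e.g. by sampling), which is where the hard work lies; the combination above only strips off the non-universal boundary terms.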
Supplementing Recurrent Neural Network Wave Functions with Symmetry and Annealing to Improve Accuracy
Recurrent neural networks (RNNs) are a class of neural networks that
emerged from the field of artificial intelligence and have enabled many
advances in natural language processing. Interestingly, these
architectures were also shown to be powerful ansätze for approximating
the ground states of quantum systems. Here, we build on the
results of [Phys. Rev. Research 2, 023358 (2020)] and construct a more powerful
RNN wave function ansatz in two dimensions. We use symmetry and annealing to
obtain accurate estimates of ground state energies of the two-dimensional (2D)
Heisenberg model, on the square lattice and on the triangular lattice. We show
that our method is superior to Density Matrix Renormalisation Group (DMRG) for
system sizes larger than or equal to on the triangular lattice.
Comment: 11 pages, 4 figures, 1 table. Corrected typos. Originally published
in Machine Learning and the Physical Sciences Workshop (NeurIPS 2021), see:
https://ml4physicalsciences.github.io/2021/files/NeurIPS_ML4PS_2021_92.pdf.
Our reproducibility code can be found at
https://github.com/mhibatallah/RNNWavefunction
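An RNN wave function is autoregressive: it factorizes the probability of a spin configuration into a product of conditionals, so configurations can be sampled exactly rather than via a Markov chain. A minimal 1D sketch in NumPy (a toy vanilla RNN with made-up sizes, not the paper's 2D architecture; the complex phase of the wave function is omitted):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (hypothetical, for illustration): N spins, hidden size H.
N, H = 6, 8
Wh = rng.normal(scale=0.1, size=(H, H))   # hidden-to-hidden weights
Wx = rng.normal(scale=0.1, size=(H, 2))   # one-hot spin input to hidden
Wo = rng.normal(scale=0.1, size=(2, H))   # hidden state to conditional logits

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def sample():
    """Draw one spin configuration autoregressively, returning it
    together with p(s) = prod_i p(s_i | s_{<i}); sampling is exact,
    with no Markov-chain dynamics."""
    h = np.zeros(H)
    x = np.zeros(2)              # "start" input before the first spin
    spins, prob = [], 1.0
    for _ in range(N):
        h = np.tanh(Wh @ h + Wx @ x)
        p = softmax(Wo @ h)      # conditional distribution for this spin
        s = rng.choice(2, p=p)
        spins.append(s)
        prob *= p[s]
        x = np.eye(2)[s]         # feed the sampled spin back in
    return np.array(spins), prob

config, p = sample()
print(config, p)
```

In the papers the amplitude is taken as psi(s) = sqrt(p(s)) e^{i phi(s)}, with the phase (or sign) produced by an additional output head; symmetry and annealing then constrain and train this ansatz.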
Variational Neural Annealing
Many important challenges in science and technology can be cast as
optimization problems. When viewed in a statistical physics framework, these
can be tackled by simulated annealing, where a gradual cooling procedure helps
search for ground-state solutions of a target Hamiltonian. While powerful,
simulated annealing is known to have prohibitively slow sampling dynamics when
the optimization landscape is rough or glassy. Here we show that by
generalizing the target distribution with a parameterized model, an analogous
annealing framework based on the variational principle can be used to search
for ground-state solutions. Modern autoregressive models such as recurrent
neural networks provide ideal parameterizations since they can be exactly
sampled without slow dynamics even when the model encodes a rough landscape. We
implement this procedure in the classical and quantum settings on several
prototypical spin glass Hamiltonians, and find that it significantly
outperforms traditional simulated annealing in the asymptotic limit,
illustrating the potential power of this as-yet unexplored route to optimization.
Comment: 19 pages, 9 figures, 1 table
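The annealing idea above — minimize a variational free energy F(T) = ⟨E⟩ − T·S while gradually lowering T — can be sketched with a factorized mean-field model standing in for the autoregressive network, since its energy and entropy then have closed forms (a toy construction with a finite-difference gradient; the paper trains RNNs with stochastic gradients instead):

```python
import numpy as np

# Mean-field sketch of variational classical annealing on a ferromagnetic
# Ising chain, E(s) = -sum_i s_i s_{i+1}. Each spin is an independent
# Bernoulli variable with magnetization m_i = tanh(theta_i).
N = 8
rng = np.random.default_rng(1)
theta = rng.normal(scale=0.1, size=N)        # variational logits

def free_energy(theta, T):
    m = np.tanh(theta)
    E = -np.sum(m[:-1] * m[1:])              # mean-field energy
    # binary entropy of each spin, written stably in terms of theta
    S = np.sum(np.logaddexp(theta, -theta) - theta * m)
    return E - T * S

def grad(theta, T, eps=1e-6):
    # finite-difference gradient keeps the sketch short
    g = np.zeros_like(theta)
    for i in range(N):
        d = np.zeros_like(theta)
        d[i] = eps
        g[i] = (free_energy(theta + d, T) - free_energy(theta - d, T)) / (2 * eps)
    return g

for T in np.linspace(2.0, 0.01, 200):        # gradual annealing schedule
    for _ in range(20):                      # a few descent steps per temperature
        theta -= 0.1 * grad(theta, T)

# The variational energy approaches the ground-state energy -(N-1) = -7.
print(free_energy(theta, 0.0))
```

At high T the entropy term keeps the distribution broad; as T drops, the model is gradually pulled toward an aligned ground state. An autoregressive model replaces the factorized one when correlations matter, which is exactly the role the RNN plays in the paper.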