Differentiable Programming Tensor Networks
Differentiable programming is an emerging programming paradigm that composes
parameterized algorithmic components and trains them using automatic
differentiation (AD). The concept originates in deep learning but is not
limited to training neural networks. We present the theory and practice of
programming tensor network algorithms in a fully differentiable way. By
formulating a tensor network algorithm as a computation graph, one can
compute higher-order derivatives of the program accurately and efficiently
using AD. We present essential techniques for differentiating through tensor
network contractions, including stable AD for tensor decomposition and
efficient backpropagation through fixed-point iterations. As a demonstration,
we compute the specific heat of the Ising model directly by taking the
second-order derivative of the free energy obtained in the tensor
renormalization group calculation. Next, we perform gradient-based variational
optimization of infinite projected entangled pair states for the quantum
antiferromagnetic Heisenberg model and obtain state-of-the-art variational
energy and magnetization with moderate effort. Differentiable programming
removes the laborious human effort of deriving and implementing analytical
gradients for tensor network programs, which opens the door to more
innovations in tensor network algorithms and applications.

Comment: Typos corrected, discussion and refs added; revised version accepted
for publication in PRX. Source code available at
https://github.com/wangleiphy/tensorgra
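The "second-order derivative of the free energy" idea can be illustrated on an exactly solvable stand-in for the paper's TRG computation. The sketch below is a minimal second-order forward-mode AD (a truncated Taylor "jet" type, a hypothetical helper not taken from the paper's code) applied to the 1D Ising chain, whose free energy density is ln z = ln(2 cosh βJ), so the specific heat is C = β² ∂²(ln z)/∂β² (with J = k_B = 1):

```python
import math

class Jet:
    """Second-order truncated Taylor number: value v, derivatives d1, d2.

    Only the operations needed below (multiplication, cosh, log) are
    implemented.
    """
    def __init__(self, v, d1=0.0, d2=0.0):
        self.v, self.d1, self.d2 = v, d1, d2

    def __mul__(self, other):
        o = other if isinstance(other, Jet) else Jet(other)
        return Jet(self.v * o.v,
                   self.d1 * o.v + self.v * o.d1,
                   self.d2 * o.v + 2 * self.d1 * o.d1 + self.v * o.d2)
    __rmul__ = __mul__

def cosh(x):
    c, s = math.cosh(x.v), math.sinh(x.v)
    # chain rule: (cosh u)' = sinh(u) u', (cosh u)'' = cosh(u) u'^2 + sinh(u) u''
    return Jet(c, s * x.d1, c * x.d1 ** 2 + s * x.d2)

def log(x):
    # (ln u)' = u'/u, (ln u)'' = u''/u - (u'/u)^2
    return Jet(math.log(x.v), x.d1 / x.v, x.d2 / x.v - (x.d1 / x.v) ** 2)

def ln_z_per_site(beta):
    # exact 1D Ising free energy density: ln z = ln(2 cosh(beta J)), J = 1
    return log(2 * cosh(beta))

def specific_heat(beta):
    # C = beta^2 * d^2(ln z)/d(beta)^2, with k_B = 1
    lnz = ln_z_per_site(Jet(beta, 1.0, 0.0))  # seed d(beta)/d(beta) = 1
    return beta ** 2 * lnz.d2

beta = 0.7
print(specific_heat(beta), beta ** 2 / math.cosh(beta) ** 2)  # both equal C(beta)
```

Seeding the input with (β, 1, 0) and propagating the Taylor coefficients through the computation graph yields the exact second derivative in one pass, matching the closed form C = β² sech²β for this toy model.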
Differentiable programming tensor networks for Kitaev magnets
We present a general computational framework to investigate ground state
properties of quantum spin models on infinite two-dimensional lattices using
automatic differentiation-based gradient optimization of infinite projected
entangled-pair states. The approach exploits variational uniform matrix
product states to contract infinite tensor networks with a unit-cell structure
and incorporates automatic differentiation to optimize the local tensors. We
applied this framework to Kitaev-type models, which involve complex
interactions and competing ground states. To evaluate the accuracy of the
method, we compared the results with exact solutions of the Kitaev model and
found better agreement for various observables than previous tensor network
calculations based on imaginary-time projection.
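Differentiating through an infinite-network contraction of this kind amounts to backpropagating through a fixed-point equation, which is done with the implicit function theorem rather than by unrolling the iteration. A minimal scalar sketch under that assumption (a toy contraction map, not the actual environment-tensor equations):

```python
import numpy as np

def f(x, theta):
    # a toy contraction map standing in for one contraction step
    return np.cos(theta * x)

def fixed_point(theta, x0=0.5, tol=1e-12, max_iter=500):
    # iterate x <- f(x, theta) to convergence
    x = x0
    for _ in range(max_iter):
        x_new = f(x, theta)
        if abs(x_new - x) < tol:
            break
        x = x_new
    return x_new

def grad_fixed_point(theta):
    # implicit function theorem at x* = f(x*, theta):
    #   dx*/dtheta = (df/dtheta) / (1 - df/dx),
    # so the iteration itself never needs to be differentiated
    x = fixed_point(theta)
    df_dx = -theta * np.sin(theta * x)
    df_dtheta = -x * np.sin(theta * x)
    return df_dtheta / (1.0 - df_dx)

theta = 1.0
g = grad_fixed_point(theta)
eps = 1e-6
g_fd = (fixed_point(theta + eps) - fixed_point(theta - eps)) / (2 * eps)
print(g, g_fd)  # the implicit gradient matches a finite-difference check
```

In the tensor network setting the same identity becomes a linear (adjoint) equation for the environment tensors; the scalar case just makes the structure visible.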
Additionally, by finding ground states with lower variational energy than
previous studies, we provide convincing evidence for the existence of nematic
paramagnetic phases and an 18-site configuration in the phase diagram of the
K-Γ model. Furthermore, for the realistic K-J-Γ-Γ′ model of the Kitaev
material α-RuCl3, we discovered a non-collinear zigzag ground state. Lastly,
we also find that the
critical out-of-plane magnetic field that suppresses this zigzag state is
lower than in previous finite-cylinder calculations. The framework is
versatile and will be useful for quick scans of phase diagrams for a broad
class of quantum spin models.
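The gradient-based variational optimization described in both abstracts can be shown in miniature: instead of an iPEPS, take the smallest Heisenberg problem (two spins-1/2) and descend the gradient of the Rayleigh quotient, which is exactly what AD would supply for the full tensor network energy. A hypothetical toy, not the papers' code:

```python
import numpy as np

# spin-1/2 operators
sx = np.array([[0.0, 1.0], [1.0, 0.0]]) / 2
sy = np.array([[0.0, -1.0j], [1.0j, 0.0]]) / 2
sz = np.array([[1.0, 0.0], [0.0, -1.0]]) / 2

# two-site antiferromagnetic Heisenberg Hamiltonian H = S1 . S2
H = sum(np.kron(s, s) for s in (sx, sy, sz)).real

def energy(psi):
    # Rayleigh quotient <psi|H|psi> / <psi|psi>
    return (psi @ H @ psi) / (psi @ psi)

def grad(psi):
    # analytic gradient of the Rayleigh quotient (what AD would produce)
    return 2 * (H @ psi - energy(psi) * psi) / (psi @ psi)

rng = np.random.default_rng(0)
psi = rng.normal(size=4)          # random variational "tensor"
for _ in range(2000):
    psi -= 0.1 * grad(psi)        # plain gradient descent

print(energy(psi))  # converges to the exact singlet energy -3/4
```

For an actual iPEPS the state has far more parameters and the energy involves an infinite contraction, but the optimization loop, a gradient step on a variational energy, has the same shape.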