Structure learning of antiferromagnetic Ising models
In this paper we investigate the computational complexity of learning the
graph structure underlying a discrete undirected graphical model from i.i.d.
samples. We first observe that the notoriously difficult problem of learning
parities with noise can be captured as a special case of learning graphical
models. This leads to an unconditional computational lower bound of for learning general graphical models on nodes of maximum degree
, for the class of so-called statistical algorithms recently introduced by
Feldman et al (2013). The lower bound suggests that the runtime
required to exhaustively search over neighborhoods cannot be significantly
improved without restricting the class of models.
Aside from structural assumptions on the graph such as it being a tree,
hypertree, tree-like, etc., many recent papers on structure learning assume
that the model has the correlation decay property. Indeed, focusing on
ferromagnetic Ising models, Bento and Montanari (2009) showed that all known
low-complexity algorithms fail to learn simple graphs when the interaction
strength exceeds a number related to the correlation decay threshold. Our
second set of results gives a class of repelling (antiferromagnetic) models
that have the opposite behavior: very strong interaction allows efficient
learning in time $O(p^2)$. We provide an algorithm whose performance
interpolates between $O(p^2)$ and $O(p^{d+2})$ depending on the strength of the
repulsion.

Comment: 15 pages. NIPS 2014
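The tension the abstract describes can be made concrete with a toy example. The sketch below is an illustration under simplifying assumptions, not the paper's algorithm: it learns a 4-node antiferromagnetic chain by thresholding pairwise correlations, using the zero-field tree identity $E[x_i x_j] = \prod_e \tanh\beta_e$ over the edges on the path from $i$ to $j$, with exact enumeration standing in for i.i.d. samples.

```python
import itertools
import numpy as np

# Toy structure learning (not the paper's algorithm): on a tree, adjacent
# pairs have the largest |E[x_i x_j]|, so simple thresholding recovers the
# graph. beta < 0 makes the model antiferromagnetic (repelling).

def exact_pair_correlations(p, edges, beta):
    """Exact E[x_i x_j] for P(x) proportional to exp(beta * sum_edges x_i x_j)."""
    states = np.array(list(itertools.product([-1, 1], repeat=p)))
    energies = np.array([sum(beta * s[i] * s[j] for i, j in edges) for s in states])
    probs = np.exp(energies)
    probs /= probs.sum()
    return np.array([[np.sum(probs * states[:, i] * states[:, j])
                      for j in range(p)] for i in range(p)])

true_edges = [(0, 1), (1, 2), (2, 3)]   # a 4-node chain
corr = exact_pair_correlations(4, true_edges, beta=-0.7)

# |tanh(0.7)| ~ 0.60 for edges, ~0.37 for distance-2 pairs; 0.5 separates them.
recovered = sorted((i, j) for i in range(4) for j in range(i + 1, 4)
                   if abs(corr[i, j]) > 0.5)
print(recovered)  # [(0, 1), (1, 2), (2, 3)]
```

Note that this naive approach breaks down exactly where the abstract says learning is hard: as the repulsion strengthens, $|\tanh\beta| \to 1$ and distance-2 correlations become as large as edge correlations, so no single threshold can separate edges from non-edges.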
Deep neural networks for direct, featureless learning through observation: the case of 2d spin models
We demonstrate the capability of a convolutional deep neural network in
predicting the nearest-neighbor energy of the 4x4 Ising model. Using its
success at this task, we motivate the study of the larger 8x8 Ising model,
showing that the deep neural network can learn the nearest-neighbor Ising
Hamiltonian after only seeing a vanishingly small fraction of configuration
space. Additionally, we show that the neural network has learned both the
energy and magnetization operators with sufficient accuracy to replicate the
low-temperature Ising phase transition. We then demonstrate the ability of the
neural network to learn other spin models, teaching the convolutional deep
neural network to accurately predict the long-range interaction of a screened
Coulomb Hamiltonian, a sinusoidally attenuated screened Coulomb Hamiltonian,
and a modified Potts model Hamiltonian. In the case of the long-range
interaction, we demonstrate the ability of the neural network to recover the
phase transition with equivalent accuracy to the numerically exact method.
Furthermore, in the case of the long-range interaction, the benefits of the
neural network become apparent; it is able to make predictions with a high
degree of accuracy, and do so 1600 times faster than a CUDA-optimized exact
calculation. Additionally, we demonstrate how the neural network succeeds at
these tasks by looking at the weights learned in a simplified demonstration.
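The regression target in the first experiment, the nearest-neighbor Ising energy, is cheap to state directly. A minimal sketch (assuming periodic boundary conditions and coupling J = 1, which the abstract does not specify):

```python
import numpy as np

def ising_energy(spins, J=1.0):
    """Nearest-neighbor energy E = -J * sum_<ij> s_i s_j of a 2D +/-1 spin
    lattice, with periodic boundary conditions (an assumption of this sketch)."""
    right = np.roll(spins, -1, axis=1)   # each site's right neighbor
    down = np.roll(spins, -1, axis=0)    # each site's lower neighbor
    return -J * np.sum(spins * (right + down))

ground = np.ones((4, 4))   # all spins aligned: 32 bonds, each contributing -J
print(ising_energy(ground))  # -32.0
```

A network trained on (configuration, energy) pairs is learning exactly this map; the 4x4 lattice has only 2^16 configurations, while the 8x8 lattice has 2^64, which is what makes training on a "vanishingly small fraction of configuration space" a meaningful test of generalization.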
Charge-Density-Wave Transitions of Dirac Fermions Coupled to Phonons
The spontaneous generation of charge-density-wave order in a Dirac fermion
system via the natural mechanism of electron-phonon coupling is studied in the
framework of the Holstein model on the honeycomb lattice. Using two independent
and unbiased quantum Monte Carlo methods, the phase diagram as a function of
temperature and coupling strength is determined. It features a quantum critical
point as well as a line of thermal critical points. Finite-size scaling appears
consistent with fermionic Gross-Neveu-Ising universality for the quantum phase
transition, and bosonic Ising universality for the thermal phase transition.
The critical temperature has a maximum at intermediate couplings. Our findings
motivate experimental efforts to identify or engineer Dirac systems with
sufficiently strong and tunable electron-phonon coupling.

Comment: 4+3 pages, 4+2 figures
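For context, the Holstein model augments a tight-binding Hamiltonian with a dispersionless phonon on each site, coupled to the local charge density. A generic textbook form is (conventions and normalizations vary, so this is not necessarily the paper's exact writing):

$$
H = -t \sum_{\langle ij \rangle, \sigma} \left( c^{\dagger}_{i\sigma} c^{\phantom{\dagger}}_{j\sigma} + \mathrm{h.c.} \right)
  + \omega_0 \sum_i b^{\dagger}_i b^{\phantom{\dagger}}_i
  + g \sum_i \left( b^{\dagger}_i + b^{\phantom{\dagger}}_i \right) \left( n_i - 1 \right)
$$

On the honeycomb lattice the hopping term alone yields Dirac fermions at half filling, while the coupling $g$ favors a sublattice-modulated density, i.e. the charge-density-wave order whose onset the paper maps out.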