Machine-learning-assisted Monte Carlo fails at sampling computationally hard problems
Several strategies have recently been proposed to improve Monte
Carlo sampling efficiency using machine learning tools. Here, we challenge
these methods by considering a class of problems that are known to be
exponentially hard to sample using conventional local Monte Carlo at low enough
temperatures. In particular, we study the antiferromagnetic Potts model on a
random graph, which reduces to the coloring of random graphs at zero
temperature. We test several machine-learning-assisted Monte Carlo approaches,
and we find that they all fail. Our work thus provides useful benchmarks for
future proposals of smart sampling algorithms.
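The conventional baseline the abstract refers to can be sketched as single-spin Metropolis dynamics for the q-state antiferromagnetic Potts model on a random graph, where the energy counts monochromatic edges and a zero-energy state at low temperature is a proper q-coloring. The sketch below is illustrative only (graph size, connectivity, and temperature are made-up parameters, not taken from the paper):

```python
# Minimal sketch: local Metropolis sampling of the q-state
# antiferromagnetic Potts model on an Erdos-Renyi random graph.
# E = number of monochromatic edges; E = 0 is a proper q-coloring.
import math
import random

def make_random_graph(n, avg_degree, rng):
    """Erdos-Renyi graph as an adjacency list, with given mean degree."""
    p = avg_degree / (n - 1)
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].append(j)
                adj[j].append(i)
    return adj

def local_energy(adj, spins, i, color):
    """Number of neighbors of node i that share the proposed color."""
    return sum(1 for j in adj[i] if spins[j] == color)

def metropolis_sweep(adj, spins, q, beta, rng):
    """One sweep of single-spin Metropolis updates; returns accept count."""
    n = len(spins)
    accepted = 0
    for _ in range(n):
        i = rng.randrange(n)
        new = rng.randrange(q)
        dE = (local_energy(adj, spins, i, new)
              - local_energy(adj, spins, i, spins[i]))
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i] = new
            accepted += 1
    return accepted

rng = random.Random(0)
n, q, c = 200, 5, 10.0          # illustrative size, colors, connectivity
adj = make_random_graph(n, c, rng)
spins = [rng.randrange(q) for _ in range(n)]
for _ in range(200):
    metropolis_sweep(adj, spins, q, beta=2.0, rng=rng)
# total energy: each monochromatic edge is counted once per endpoint
energy = sum(local_energy(adj, spins, i, spins[i]) for i in range(n)) // 2
```

At low temperature (large beta), this local dynamics is exactly the kind of sampler that becomes exponentially slow on such instances, which is what makes the model a hard benchmark.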
Efficient generative modeling of protein sequences using simple autoregressive models
Generative models emerge as promising candidates for novel sequence-data
driven approaches to protein design, and for the extraction of structural and
functional information about proteins deeply hidden in rapidly growing sequence
databases. Here we propose simple autoregressive models as highly accurate but
computationally efficient generative sequence models. We show that they perform
similarly to existing approaches based on Boltzmann machines or deep generative
models, but at a substantially lower computational cost (by a factor between
and ). Furthermore, the simple structure of our models has
distinctive mathematical advantages, which translate into an improved
applicability in sequence generation and evaluation. Within these models, we
can easily estimate both the probability of a given sequence, and, using the
model's entropy, the size of the functional sequence space related to a
specific protein family. In the example of response regulators, we find a huge
number of ca. possible sequences, which nevertheless constitute only
the astronomically small fraction of all amino-acid sequences of the
same length. These findings illustrate the potential and the difficulty in
exploring sequence space via generative sequence models.
Comment: 12 pages, 4 Figures + Supplementary Material
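The two properties the abstract highlights — exact sequence probabilities and an entropy-based estimate of the functional sequence space — follow directly from the autoregressive factorization P(s) = ∏_i P(s_i | s_1…s_{i-1}). The toy sketch below illustrates this with a made-up first-order parameterization on a tiny alphabet (the paper's actual model and protein alphabet are richer; all names and sizes here are illustrative assumptions):

```python
# Toy autoregressive sequence model: P(s) = prod_i P(s_i | prev symbol).
# Conditionals depend only on position and previous symbol (illustrative
# parameterization, not the paper's model).
import math
import random

ALPHABET = "ACDE"   # toy 4-letter alphabet, not the 20 amino acids
L = 6               # toy sequence length
rng = random.Random(1)

def random_conditional(size):
    """Random categorical distribution over the alphabet."""
    w = [rng.random() + 0.1 for _ in range(size)]
    z = sum(w)
    return [x / z for x in w]

# cond[pos][prev] -> distribution over next symbol (prev = -1 at pos 0)
cond = [{prev: random_conditional(len(ALPHABET))
         for prev in range(-1, len(ALPHABET))}
        for _ in range(L)]

def log_prob(seq):
    """Exact log-probability of a sequence: sum of conditional log-probs."""
    lp, prev = 0.0, -1
    for pos, ch in enumerate(seq):
        a = ALPHABET.index(ch)
        lp += math.log(cond[pos][prev][a])
        prev = a
    return lp

def sample():
    """Ancestral sampling: draw each symbol from its conditional in turn."""
    seq, prev = [], -1
    for pos in range(L):
        a = rng.choices(range(len(ALPHABET)),
                        weights=cond[pos][prev])[0]
        seq.append(ALPHABET[a])
        prev = a
    return "".join(seq)

# Monte Carlo estimate of the model entropy H = -E[log P(s)];
# exp(H) estimates the effective size of the modeled sequence space.
samples = [sample() for _ in range(5000)]
H = -sum(log_prob(s) for s in samples) / len(samples)
```

Both operations are cheap precisely because the factorization makes each conditional explicit: no partition function is needed, unlike in Boltzmann-machine approaches.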