Quantifying configuration-sampling error in Langevin simulations of complex molecular systems
While Langevin integrators are popular in the study of equilibrium properties of complex systems, it is challenging to estimate the timestep-induced discretization error: the degree to which the sampled phase-space or configuration-space probability density departs from the desired target density due to the use of a finite integration timestep. Sivak et al. introduced a convenient approach to approximating a natural measure of error between the sampled density and the target equilibrium density, the Kullback-Leibler (KL) divergence, in phase space, but did not specifically address the issue of configuration-space properties, which are much more commonly of interest in molecular simulations. Here, we introduce a variant of this near-equilibrium estimator capable of measuring the error in the configuration-space marginal density, validating it against a complex but exact nested Monte Carlo estimator to show that it reproduces the KL divergence with high fidelity. To illustrate its utility, we employ this new near-equilibrium estimator to assess a claim that a recently proposed Langevin integrator introduces extremely small configuration-space density errors up to the stability limit at no extra computational expense. Finally, we show how this approach to quantifying sampling bias can be applied to a wide variety of stochastic integrators by following a straightforward procedure to compute the appropriate shadow work, and describe how it can be extended to quantify the error in arbitrary marginal or conditional distributions of interest.
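As a hedged toy illustration of the kind of timestep-induced configuration-space bias discussed above (not the paper's near-equilibrium estimator): one can run a BAOAB-type Langevin integrator on a 1D double well and compare a histogram of sampled positions with the exact Boltzmann marginal through a discretized KL divergence. The potential, friction, temperature, and bin choices below are arbitrary assumptions for the sketch.

```python
import numpy as np

def force(x):                      # U(x) = (x^2 - 1)^2, a double well
    return -4.0 * x * (x**2 - 1.0)

def potential(x):
    return (x**2 - 1.0)**2

def baoab_step(x, v, dt, gamma, kT, rng):
    """One BAOAB step: B (half kick), A (half drift), O (bath), A, B."""
    v += 0.5 * dt * force(x)
    x += 0.5 * dt * v
    c = np.exp(-gamma * dt)
    v = c * v + np.sqrt((1.0 - c * c) * kT) * rng.standard_normal()
    x += 0.5 * dt * v
    v += 0.5 * dt * force(x)
    return x, v

def kl_config_bias(dt, n_steps=100_000, gamma=1.0, kT=1.0, seed=0):
    """Discretized D(p_sampled || p_exact) over the configuration marginal."""
    rng = np.random.default_rng(seed)
    x, v = -1.0, 0.0
    xs = np.empty(n_steps)
    for i in range(n_steps):
        x, v = baoab_step(x, v, dt, gamma, kT, rng)
        xs[i] = x
    edges = np.linspace(-2.5, 2.5, 51)
    width = edges[1] - edges[0]
    p, _ = np.histogram(xs, bins=edges, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    q = np.exp(-potential(centers) / kT)
    q /= q.sum() * width                   # normalize the exact marginal
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask])) * width
```

Note that this histogram estimator carries its own finite-sampling floor, which is part of what makes the shadow-work approach in the paper attractive.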
Efficient equilibrium sampling of all-atom peptides using library-based Monte Carlo
We applied our previously developed library-based Monte Carlo (LBMC) to
equilibrium sampling of several implicitly solvated all-atom peptides. LBMC can
perform equilibrium sampling of molecules using the pre-calculated statistical
libraries of molecular-fragment configurations and energies. For this study, we
employed residue-based fragments distributed according to the Boltzmann factor
of the OPLS-AA forcefield describing the individual fragments. Two solvent
models were employed: a simple uniform dielectric and the Generalized
Born/Surface Area (GBSA) model. The efficiency of LBMC was compared to standard
Langevin dynamics (LD) using three different statistical tools. The statistical
analyses indicate that LBMC is more than 100 times faster than LD not only for
the simple solvent model but also for GBSA.
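A hedged toy sketch of the library-based Monte Carlo idea (not the authors' code): each "fragment" has a pre-tabulated library of configurations drawn from its own internal Boltzmann distribution, so a trial move swaps one fragment's configuration for another library entry, and because the internal energies cancel in the acceptance ratio, acceptance depends only on the change in the interaction energy between fragments. Here fragments are scalar angles with a cosine nearest-neighbour coupling, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
kT, n_frag, lib_size = 1.0, 8, 4096

# Pre-compute a per-fragment library: samples from exp(-u(phi)/kT) with
# internal potential u(phi) = 1 - cos(phi), via rejection sampling.
def build_library(size):
    out = []
    while len(out) < size:
        phi = rng.uniform(-np.pi, np.pi)
        if rng.random() < np.exp(-(1.0 - np.cos(phi)) / kT):
            out.append(phi)
    return np.array(out)

libs = [build_library(lib_size) for _ in range(n_frag)]

def interaction(phis):
    # Coupling between neighbouring fragments (not part of the libraries).
    return np.sum(1.0 - np.cos(np.diff(phis)))

# Library-based MC sweep: internal energies cancel between proposal and
# target, so the Metropolis test involves only the interaction energy.
phis = np.array([lib[0] for lib in libs])
e = interaction(phis)
for sweep in range(2000):
    i = rng.integers(n_frag)
    trial = phis.copy()
    trial[i] = libs[i][rng.integers(lib_size)]
    e_new = interaction(trial)
    if rng.random() < np.exp(-(e_new - e) / kT):
        phis, e = trial, e_new
```

The efficiency claim in the abstract comes from the fact that all the expensive internal-energy evaluation is amortized into the one-time library construction.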
Extending fragment-based free energy calculations with library Monte Carlo simulation: Annealing in interaction space
Pre-calculated libraries of molecular fragment configurations have previously
been used as a basis for both equilibrium sampling (via "library-based Monte
Carlo") and for obtaining absolute free energies using a polymer-growth
formalism. Here, we combine the two approaches to extend the size of systems
for which free energies can be calculated. We study a series of all-atom
poly-alanine systems in a simple dielectric "solvent" and find that precise
free energies can be obtained rapidly. For instance, for 12 residues, less than
an hour of single-processor time is required. The combined approach is formally
equivalent to the "annealed importance sampling" algorithm; instead of
annealing by decreasing temperature, however, interactions among fragments are
gradually added as the molecule is "grown." We discuss implications for future
binding affinity calculations in which a ligand is grown into a binding site.
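The annealed-importance-sampling structure described above can be sketched on a toy model where the answer is exact (this is an illustrative assumption, not the paper's fragment-growth code): the reference density is N(0,1), the "interaction" lam * x^2 / 2 is switched on in K stages, so the target is N(0, 1/2) and the exact free-energy difference is (kT/2) ln 2. Real fragment growth would replace the exact Gaussian re-equilibration step with equilibrium moves at each intermediate lambda.

```python
import numpy as np

rng = np.random.default_rng(2)
kT, K, N = 1.0, 50, 5000
lams = np.linspace(0.0, 1.0, K + 1)

x = rng.standard_normal(N)            # exact draw from the lam = 0 ensemble
W = np.zeros(N)                       # accumulated protocol work per replica
for lam_old, lam_new in zip(lams[:-1], lams[1:]):
    W += (lam_new - lam_old) * 0.5 * x**2              # switch on coupling
    x = rng.standard_normal(N) / np.sqrt(1.0 + lam_new)  # re-equilibrate

# Jarzynski/AIS estimator of the free-energy difference.
dF = -kT * np.log(np.mean(np.exp(-W / kT)))
exact = 0.5 * kT * np.log(2.0)
```

Annealing in lambda (interaction strength) rather than in temperature is exactly the substitution the abstract describes.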
Two mathematical tools to analyze metastable stochastic processes
We present how entropy estimates and logarithmic Sobolev inequalities on the
one hand, and the notion of quasi-stationary distribution on the other hand,
are useful tools to analyze metastable overdamped Langevin dynamics, in
particular to quantify the degree of metastability. We discuss how these
approaches can be used to estimate the efficiency of some classical algorithms
for speeding up sampling, and to evaluate the error introduced by some
coarse-graining procedures. This paper summarizes a plenary talk given by
the author at the ENUMATH 2011 conference.
Coherent States Formulation of Polymer Field Theory
We introduce a stable and efficient complex Langevin (CL) scheme to enable
the first numerical simulations of the coherent-states (CS) formulation of
polymer field theory. In contrast with Edwards' well known auxiliary-field (AF)
framework, the CS formulation does not contain an embedded non-linear,
non-local functional of the auxiliary fields, and the action of the field
theory has a fully explicit, finite-order and semi-local polynomial character.
In the context of a polymer solution model, we demonstrate that the new CS-CL
dynamical scheme for sampling fluctuations in the space of coherent states
yields results in good agreement with now-standard AF simulations. The
formalism is potentially applicable to a broad range of polymer architectures
and may facilitate systematic generation of trial actions for use in
coarse-graining and numerical renormalization-group studies.
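A hedged one-variable toy of the complex Langevin (CL) idea (not the paper's coherent-states scheme): for the "action" S(z) = sigma z^2 / 2 with complex sigma, the CL update dz = -sigma z dt + sqrt(2 dt) eta with real noise eta drives z into the complex plane, and its time average reproduces the exact expectation <z^2> = 1/sigma for this Gaussian model. All parameters are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(4)
sigma = 1.0 + 1.0j                    # complex action coefficient
dt, n_walk, n_steps, burn = 1e-3, 64, 40_000, 2_000

z = np.zeros(n_walk, dtype=complex)   # independent CL walkers
acc, count = 0.0 + 0.0j, 0
for step in range(n_steps):
    # Complex Langevin step: complex drift, real Gaussian noise.
    z = z - sigma * z * dt + np.sqrt(2.0 * dt) * rng.standard_normal(n_walk)
    if step >= burn:
        acc += np.sum(z * z)
        count += n_walk
z2 = acc / count
exact = 1.0 / sigma                   # = 0.5 - 0.5j for sigma = 1 + i
```

For this linear model the CL fixed point is provably correct; the abstract's point is that the CS action's polynomial, semi-local character makes such stable CL sampling feasible for polymer field theory.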
Forward Flux Sampling for rare event simulations
Rare events are ubiquitous in many different fields, yet they are notoriously
difficult to simulate because few, if any, events are observed in a
conventional simulation run. Over the past several decades, specialised simulation methods
have been developed to overcome this problem. We review one recently-developed
class of such methods, known as Forward Flux Sampling. Forward Flux Sampling
uses a series of interfaces between the initial and final states to calculate
rate constants and generate transition paths, for rare events in equilibrium or
nonequilibrium systems with stochastic dynamics. This review draws together a
number of recent advances, summarizes several applications of the method and
highlights challenges that remain to be overcome. Accepted for publication in
J. Phys.: Condensed Matter.
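A minimal Forward Flux Sampling sketch, assuming a 1D toy system rather than anything from the review: overdamped Langevin dynamics in the double well U(x) = (x^2 - 1)^2, with the A -> B rate estimated as the flux through the first interface times the product of conditional interface-crossing probabilities. Interface positions, temperature, and trial counts are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(5)
kT, dt = 0.3, 1e-3
lambdas = [-0.6, -0.2, 0.2, 0.6, 1.0]    # interfaces lambda_0..lambda_n
A_edge, n_trials = -0.8, 200             # basin-A boundary, trials per stage

def step(x):
    """One Euler-Maruyama step of overdamped Langevin dynamics."""
    return x - 4.0 * x * (x**2 - 1.0) * dt \
             + np.sqrt(2.0 * kT * dt) * rng.standard_normal()

# Stage 0: a long run in basin A; store forward crossings of lambda_0.
x, t_total, crossings, armed = -1.0, 0.0, [], True
for _ in range(300_000):
    x = step(x)
    t_total += dt
    if armed and x >= lambdas[0]:
        crossings.append(x)
        armed = False                    # re-arm only after returning to A
    elif x < A_edge:
        armed = True
flux0 = len(crossings) / t_total

# Stages 1..n: fire trials from stored configurations; a trial succeeds
# if it reaches the next interface before falling back into basin A.
probs, configs = [], crossings
for lam_next in lambdas[1:]:
    successes = []
    for _ in range(n_trials):
        x = configs[rng.integers(len(configs))]
        while A_edge < x < lam_next:
            x = step(x)
        if x >= lam_next:
            successes.append(x)
    probs.append(len(successes) / n_trials)
    configs = successes
    if not configs:
        break

rate = flux0 * float(np.prod(probs))
```

Because each stage only needs trajectories to reach the next nearby interface, no trial ever has to cross the full barrier at once, which is what makes rare-event rates accessible.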