151 research outputs found

    Adversarial reverse mapping of equilibrated condensed-phase molecular structures

    A tight and consistent link between resolutions is crucial to further expand the impact of multiscale modeling for complex materials. We herein tackle the generation of condensed molecular structures as a refinement, known as backmapping, of a coarse-grained (CG) structure. Traditional schemes start from a rough coarse-to-fine mapping and perform further energy minimization and molecular dynamics simulations to equilibrate the system. In this study we introduce DeepBackmap: a deep neural network-based approach to directly predict equilibrated molecular structures for condensed-phase systems. We use generative adversarial networks to learn the Boltzmann distribution from training data and realize reverse mapping by using the CG structure as a conditional input. We apply our method to a challenging condensed-phase polymeric system. We observe that the model trained in a melt has remarkable transferability to the crystalline phase. The combination of data-driven and physics-based aspects of our architecture helps reach temperature transferability with only limited training data.
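    To make the setup concrete, the sketch below (hypothetical code, not the authors' implementation) shows the forward coarse-graining map that a backmapping model must invert: each CG bead is the mass-weighted center of the atoms assigned to it. A conditional generator would receive these bead positions as input and predict the missing atomistic detail.

    ```python
    import numpy as np

    def coarse_grain(positions, masses, bead_assignment, n_beads):
        """Map atomistic coordinates to CG bead positions (centers of mass).

        positions:       (n_atoms, 3) atomistic coordinates
        masses:          (n_atoms,) atomic masses
        bead_assignment: (n_atoms,) index of the bead each atom belongs to
        """
        beads = np.zeros((n_beads, 3))
        for b in range(n_beads):
            sel = bead_assignment == b
            # mass-weighted average of the atoms mapped to bead b
            beads[b] = np.average(positions[sel], axis=0, weights=masses[sel])
        return beads

    # Two beads of two equal-mass atoms each: each bead sits at the midpoint.
    pos = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0],
                    [0.0, 4.0, 0.0], [0.0, 6.0, 0.0]])
    m = np.ones(4)
    assign = np.array([0, 0, 1, 1])
    print(coarse_grain(pos, m, assign, 2))
    # [[1. 0. 0.]
    #  [0. 5. 0.]]
    ```

    Many atomistic configurations map to the same CG structure, which is why the paper frames backmapping as conditional generation rather than a deterministic inverse.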

    Interpretable embeddings from molecular simulations using Gaussian mixture variational autoencoders

    Extracting insight from the enormous quantity of data generated from molecular simulations requires the identification of a small number of collective variables whose corresponding low-dimensional free-energy landscape retains the essential features of the underlying system. Data-driven techniques provide a systematic route to constructing this landscape, without the need for extensive a priori intuition into the relevant driving forces. In particular, autoencoders are powerful tools for dimensionality reduction, as they naturally force an information bottleneck and, thereby, a low-dimensional embedding of the essential features. While variational autoencoders ensure continuity of the embedding by assuming a unimodal Gaussian prior, this is at odds with the multi-basin free-energy landscapes that typically arise from the identification of meaningful collective variables. In this work, we incorporate this physical intuition into the prior by employing a Gaussian mixture variational autoencoder (GMVAE), which encourages the separation of metastable states within the embedding. The GMVAE performs dimensionality reduction and clustering within a single unified framework, and is capable of identifying the inherent dimensionality of the input data, in terms of the number of Gaussians required to categorize the data. We illustrate our approach on two toy models, alanine dipeptide, and a challenging disordered peptide ensemble, demonstrating the enhanced clustering effect of the GMVAE prior compared to standard VAEs. The resulting embeddings appear to be promising representations for constructing Markov state models, highlighting the transferability of the dimensionality reduction from static equilibrium properties to dynamics.
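    The key ingredient is the mixture prior over the latent space. The sketch below (an illustrative stand-in, not the paper's code) evaluates a latent point under a diagonal-covariance Gaussian mixture and returns the per-component responsibilities; it is these soft assignments that let a GMVAE cluster metastable states while embedding them.

    ```python
    import numpy as np

    def gmm_log_prior(z, means, log_vars, weights):
        """Log-density of latent z under a diagonal Gaussian mixture prior,
        plus the component responsibilities used for clustering.

        z:        (d,) latent point
        means:    (k, d) component means
        log_vars: (k, d) component log-variances
        weights:  (k,) mixture weights, summing to 1
        """
        diff2 = (z[None, :] - means) ** 2
        # log N(z | mu_k, diag(sigma_k^2)), summed over latent dimensions
        log_comp = -0.5 * np.sum(log_vars + np.log(2 * np.pi)
                                 + diff2 / np.exp(log_vars), axis=1)
        log_joint = np.log(weights) + log_comp
        log_p = np.logaddexp.reduce(log_joint)   # log sum_k w_k N(z | k)
        resp = np.exp(log_joint - log_p)         # soft cluster assignment
        return log_p, resp

    # Two well-separated components: a point near the first one is
    # assigned to it with near-certainty.
    means = np.array([[-3.0, 0.0], [3.0, 0.0]])
    log_vars = np.zeros((2, 2))
    weights = np.array([0.5, 0.5])
    _, resp = gmm_log_prior(np.array([-3.0, 0.0]), means, log_vars, weights)
    print(resp)  # first component dominates
    ```

    In the full GMVAE these means, variances, and weights are learned jointly with the encoder, so well-separated components correspond to distinct metastable basins.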