47 research outputs found

    Atomic detail visualization of photosynthetic membranes with GPU-accelerated ray tracing

    The cellular process responsible for providing energy for most life on Earth, namely, photosynthetic light-harvesting, requires the cooperation of hundreds of proteins across an organelle, involving length and time scales spanning several orders of magnitude over quantum and classical regimes. Simulation and visualization of this fundamental energy conversion process pose many unique methodological and computational challenges. We present, in two accompanying movies, light-harvesting in the photosynthetic apparatus found in purple bacteria, the so-called chromatophore. The movies are the culmination of three decades of modeling efforts, featuring the collaboration of theoretical, experimental, and computational scientists. We describe the techniques that were used to build, simulate, analyze, and visualize the structures shown in the movies, and we highlight cases where scientific needs spurred the development of new parallel algorithms that efficiently harness GPU accelerators and petascale computers.

    Cryo-EM model validation recommendations based on outcomes of the 2019 EMDataResource challenge

    This paper describes outcomes of the 2019 Cryo-EM Model Challenge. The goals were to (1) assess the quality of models that can be produced from cryogenic electron microscopy (cryo-EM) maps using current modeling software, (2) evaluate reproducibility of modeling results from different software developers and users, and (3) compare performance of current metrics used for model evaluation, particularly Fit-to-Map metrics, with focus on near-atomic resolution. Our findings demonstrate the relatively high accuracy and reproducibility of cryo-EM models derived by 13 participating teams from four benchmark maps, including three forming a resolution series (1.8 to 3.1 Å). The results permit specific recommendations to be made about validating near-atomic cryo-EM structures, both in the context of individual experiments and structure data archives such as the Protein Data Bank. We recommend the adoption of multiple scoring parameters to provide full and objective annotation and assessment of the model, reflective of the observed cryo-EM map density.
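A widely used Fit-to-Map score of the kind compared in the challenge is the real-space cross-correlation between the model-derived density and the experimental map. A minimal sketch, assuming both maps are sampled on the same grid (the function name and the optional masking scheme are illustrative, not taken from any specific validation package):

```python
import numpy as np

def cross_correlation(map_model, map_exp, threshold=None):
    """Real-space cross-correlation coefficient between two density maps.

    map_model, map_exp: 3-D numpy arrays sampled on the same grid.
    threshold: if given, only voxels where the experimental density exceeds
    it contribute, mimicking a masked Fit-to-Map score.
    """
    a = np.asarray(map_model, dtype=float)
    b = np.asarray(map_exp, dtype=float)
    if threshold is not None:
        mask = b > threshold
        a, b = a[mask], b[mask]
    # Subtract means so the score measures shape agreement, not offsets.
    a = a - a.mean()
    b = b - b.mean()
    return float(np.dot(a.ravel(), b.ravel())
                 / (np.linalg.norm(a) * np.linalg.norm(b)))
```

A perfectly fitting model scores 1; thresholding out solvent voxels is one common way such scores are made more sensitive to the macromolecular region.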

    Atoms to phenotypes: Molecular design principles of cellular energy metabolism

    We report a 100-million atom-scale model of an entire cell organelle, a photosynthetic chromatophore vesicle from a purple bacterium, that reveals the cascade of energy conversion steps culminating in the generation of ATP from sunlight. Molecular dynamics simulations of this vesicle elucidate how the integral membrane complexes influence local curvature to tune photoexcitation of pigments. Brownian dynamics of small molecules within the chromatophore probe the mechanisms of directional charge transport under various pH and salinity conditions. Reproducing phenotypic properties from atomistic details, a kinetic model evinces that low-light adaptations of the bacterium emerge as a spontaneous outcome of optimizing the balance between the chromatophore’s structural integrity and robust energy conversion. Parallels are drawn with the more universal mitochondrial bioenergetic machinery, from whence molecular-scale insights into the mechanism of cellular aging are inferred. Together, our integrative method and spectroscopic experiments pave the way to first-principles modeling of whole living cells.
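The kinetic-model layer mentioned above derives phenotype-level behavior from rates of individual conversion steps. Purely as an illustration of that style of model (the three-state scheme and all rate constants below are invented placeholders, not the paper's reaction network):

```python
def simulate_cascade(k_abs=5.0, k_et=50.0, k_atp=20.0, t_end=2.0, dt=1e-4):
    """Toy linear kinetic cascade, integrated with forward Euler:

        ground --k_abs--> excited --k_et--> charge-separated --k_atp--> ATP

    Rates are in arbitrary inverse-time units. Returns the ATP yield
    per initial ground-state unit at time t_end.
    """
    g, e, c, atp = 1.0, 0.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        dg = -k_abs * g
        de = k_abs * g - k_et * e
        dc = k_et * e - k_atp * c
        datp = k_atp * c
        g += dg * dt
        e += de * dt
        c += dc * dt
        atp += datp * dt  # total population g + e + c + atp stays ~1
    return atp
```

In a real model of this kind, the rates would be derived from the atomistic and Brownian dynamics simulations, which is precisely the "atoms to phenotypes" bridge the abstract describes.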

    NAMD_Large_System_Simulations_Chromatophore

    Contained herein are scripts for building all-atom models of a Rhodobacter sphaeroides chromatophore, configuration scripts for Molecular Dynamics (MD) and Brownian Dynamics (BD) simulations of that model with the simulation package NAMD, and scripts to run APBS calculations. The chromatophore structure file, gridpdb file, and map file required to run the NAMD simulation are available at the following links:
    https://drive.google.com/a/asu.edu/file/d/10gVADYl320HJiNoe4ilagAKoI7UeR638/view?usp=sharing
    https://drive.google.com/a/asu.edu/file/d/1MZpm3ABXIPjfbKYLOZOSKbmQJZx4yma1/view?usp=sharing
    https://drive.google.com/a/asu.edu/file/d/1v9Ni3OlrG45qfA5bO2meIc5xwxT8A1En/view?usp=sharing
    The movies for the submitted manuscript are linked here:
    https://drive.google.com/a/asu.edu/file/d/1N4GmSsWBGxJVrYXVWBHhP-Stf5ASWkZx/view?usp=sharin

    Order parameters for macromolecules: Application to multiscale simulation

    Order parameters (OPs) characterizing the nanoscale features of macromolecules are presented. They are generated in a general fashion so that they do not need to be redesigned with each new application. They evolve on time scales much longer than the 10⁻¹⁴ s typical of individual atomic collisions/vibrations. The list of OPs can be automatically increased, and completeness can be determined via a correlation analysis. They serve as the basis of a multiscale analysis that starts with the N-atom Liouville equation and yields rigorous Smoluchowski/Langevin equations of stochastic OP dynamics. Such OPs and the multiscale analysis imply computational algorithms that we demonstrate in an application to ribonucleic acid structural dynamics for 50 ns.
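The stochastic OP dynamics described above can be sketched under the simplifying assumption of a single scalar OP with constant friction, integrated by the Euler-Maruyama scheme (the function name and parameter values are illustrative, not from the paper):

```python
import numpy as np

def evolve_op(phi0, force, gamma=1.0, kT=1.0, dt=1e-3, n_steps=100_000, seed=0):
    """Euler-Maruyama integration of overdamped Langevin dynamics
    for a single order parameter phi:

        dphi = (force(phi) / gamma) dt + sqrt(2 kT dt / gamma) * xi,

    where xi is a standard normal deviate. Returns the trajectory.
    """
    rng = np.random.default_rng(seed)
    noise_amp = np.sqrt(2.0 * kT * dt / gamma)
    phi = phi0
    traj = np.empty(n_steps)
    for i in range(n_steps):
        phi += force(phi) / gamma * dt + noise_amp * rng.standard_normal()
        traj[i] = phi
    return traj
```

A quick consistency check: with a harmonic thermal force f(phi) = -k*phi, the stationary variance of phi should approach the equipartition value kT/k.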

    Multiscale Macromolecular Simulation: Role of Evolving Ensembles

    Multiscale analysis provides an algorithm for the efficient simulation of macromolecular assemblies. This algorithm involves the coevolution of a quasiequilibrium probability density of atomic configurations and the Langevin dynamics of spatial coarse-grained variables denoted order parameters (OPs) characterizing nanoscale system features. In practice, implementation of the probability density involves the generation of constant-OP ensembles of atomic configurations. Such ensembles are used to construct thermal forces and diffusion factors that mediate the stochastic OP dynamics. Generation of all-atom ensembles at every Langevin time step is computationally expensive. Here, multiscale computation for macromolecular systems is made more efficient by a method that self-consistently folds in ensembles of all-atom configurations constructed in an earlier step, the history, of the Langevin evolution. This procedure accounts for the temporal evolution of these ensembles, accurately providing thermal forces and diffusion factors. It is shown that the efficiency and accuracy of the OP-based simulations are increased via the integration of this historical information. Accuracy improves with the square root of the number of historical timesteps included in the calculation. As a result, CPU usage can be decreased by a factor of 3–8 without loss of accuracy. The algorithm is implemented in our existing force-field based multiscale simulation platform and demonstrated via the structural dynamics of viral capsomers.
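The square-root accuracy gain from including historical timesteps is the familiar averaging effect: pooling n independent historical ensembles reduces the statistical (RMS) error of an ensemble-averaged quantity, such as a thermal force, by a factor of 1/sqrt(n). A toy demonstration of that scaling (the samples here are synthetic Gaussian noise standing in for all-atom configurations; names and sizes are illustrative):

```python
import numpy as np

def force_estimate(samples):
    """Thermal-force-like estimator: a plain ensemble average."""
    return np.mean(samples)

def rms_error(n_history, ensemble_size=50, n_trials=2000,
              true_force=0.0, seed=1):
    """RMS error of a force estimate built by pooling n_history
    historical ensembles, each of ensemble_size noisy samples
    scattered around the exact value true_force."""
    rng = np.random.default_rng(seed)
    errs = []
    for _ in range(n_trials):
        pooled = rng.normal(true_force, 1.0, size=(n_history, ensemble_size))
        errs.append(force_estimate(pooled) - true_force)
    return float(np.sqrt(np.mean(np.square(errs))))
```

Doubling the amount of history halves the error only after a fourfold increase, which is why the paper's reported 3–8x CPU savings at fixed accuracy is the practically interesting quantity.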
