Computation of protein geometry and its applications: Packing and function prediction
This chapter discusses geometric models of biomolecules and geometric
constructs, including the union of balls model, the weighted Voronoi diagram,
the weighted Delaunay triangulation, and alpha shapes. These geometric
constructs enable fast and analytical computation of the shapes of biomolecules
(including features such as voids and pockets) and metric properties (such as
area and volume). The algorithms for Delaunay triangulation, computation of
voids and pockets, as well as volume/area computation are also described. In
addition, applications in packing analysis of protein structures and protein
function prediction are discussed.
Comment: 32 pages, 9 figures
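The metric computations described above can be illustrated with a simple stand-in: instead of the exact analytical alpha-shape formulas the chapter presents, the sketch below estimates the volume of a union-of-balls model by Monte Carlo rejection sampling. All names here are hypothetical, and the approach is a toy approximation, not the chapter's method.

```python
import math
import random

def union_of_balls_volume(balls, n_samples=200_000, seed=0):
    """Monte Carlo estimate of the volume of a union of 3D balls.

    `balls` is a list of (x, y, z, r) tuples. A point sampled uniformly
    in the bounding box is a "hit" if it lies inside any ball; the hit
    fraction times the box volume estimates the union volume.
    """
    rng = random.Random(seed)
    # Axis-aligned bounding box of the whole union.
    lo = [min(b[i] - b[3] for b in balls) for i in range(3)]
    hi = [max(b[i] + b[3] for b in balls) for i in range(3)]
    box_vol = math.prod(hi[i] - lo[i] for i in range(3))
    hits = 0
    for _ in range(n_samples):
        p = [rng.uniform(lo[i], hi[i]) for i in range(3)]
        if any((p[0] - x) ** 2 + (p[1] - y) ** 2 + (p[2] - z) ** 2 <= r * r
               for x, y, z, r in balls):
            hits += 1
    return box_vol * hits / n_samples

# Single unit ball: exact volume is 4*pi/3, roughly 4.19.
est = union_of_balls_volume([(0.0, 0.0, 0.0, 1.0)])
```

Unlike the alpha-shape approach, this estimator gives only a statistical answer and cannot isolate voids or pockets, which is precisely what the exact geometric constructs are for.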
Improvements to the APBS biomolecular solvation software suite
The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve
the equations of continuum electrostatics for large biomolecular assemblages
and has had an impact on the study of a broad range of chemical, biological,
and biomedical applications. APBS addresses three key technology challenges for
understanding solvation and electrostatics in biomedical applications: accurate
and efficient models for biomolecular solvation and electrostatics, robust and
scalable software for applying those theories to biomolecular systems, and
mechanisms for sharing and analyzing biomolecular electrostatics data in the
scientific community. To address new research applications and take advantage
of advancing computational capabilities, we have continually updated APBS and
its suite of
accompanying software since its release in 2001. In this manuscript, we discuss
the models and capabilities that have recently been implemented within the APBS
software package including: a Poisson-Boltzmann analytical and a
semi-analytical solver, an optimized boundary element solver, a geometry-based
geometric flow solvation model, a graph theory based algorithm for determining
pKa values, and an improved web-based visualization tool for viewing
electrostatics.
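As a toy illustration of the continuum-electrostatics problem APBS solves, the sketch below applies Gauss-Seidel iteration to the 1D linearized Poisson-Boltzmann equation with Debye screening. APBS itself uses multigrid finite-difference, finite-element, and boundary-element solvers in 3D; the function name and discretization here are illustrative assumptions only.

```python
import math

def solve_linear_pb_1d(kappa, length, n=101, iters=20_000):
    """Gauss-Seidel solve of d^2 phi/dx^2 = kappa^2 * phi on [0, length]
    with boundary conditions phi(0) = 1, phi(length) = 0.

    kappa is the inverse Debye length; the analytic solution is
    phi(x) = sinh(kappa * (length - x)) / sinh(kappa * length).
    """
    h = length / (n - 1)
    phi = [0.0] * n
    phi[0] = 1.0  # fixed potential at the left boundary
    for _ in range(iters):
        for i in range(1, n - 1):
            # Discrete form: (phi[i-1] - 2*phi[i] + phi[i+1]) / h^2
            #                = kappa^2 * phi[i], solved for phi[i].
            phi[i] = (phi[i - 1] + phi[i + 1]) / (2.0 + (kappa * h) ** 2)
    return phi

phi = solve_linear_pb_1d(kappa=2.0, length=1.0)
# Midpoint value to compare against the analytic screened profile.
exact_mid = math.sinh(2.0 * 0.5) / math.sinh(2.0)
```

The same relaxation idea generalizes to 3D grids, but convergence of plain Gauss-Seidel degrades with grid size, which is why production solvers layer multigrid acceleration on top of it.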
Many-Task Computing and Blue Waters
This report discusses many-task computing (MTC) generically and in the
context of the proposed Blue Waters system, which is planned to be the largest
NSF-funded supercomputer when it begins production use in 2012. The aim of this
report is to inform the BW project about MTC, including understanding aspects
of MTC applications that can be used to characterize the domain and
understanding the implications of these aspects to middleware and policies.
Many MTC applications do not neatly fit the stereotypes of high-performance
computing (HPC) or high-throughput computing (HTC) applications. Like HTC
applications, by definition MTC applications are structured as graphs of
discrete tasks, with explicit input and output dependencies forming the graph
edges. However, MTC applications have significant features that distinguish
them from typical HTC applications. In particular, different engineering
constraints for hardware and software must be met in order to support these
applications. HTC applications have traditionally run on platforms such as
grids and clusters, through either workflow systems or parallel programming
systems. MTC applications, in contrast, will often demand a short time to
solution, may be communication intensive or data intensive, and may comprise
very short tasks. Therefore, hardware and software for MTC must be engineered
to support the additional communication and I/O and must minimize task dispatch
overheads. The hardware of large-scale HPC systems, with its high degree of
parallelism and support for intensive communication, is well suited for MTC
applications. However, HPC systems often lack a dynamic resource-provisioning
feature, are not ideal for task communication via the file system, and have an
I/O system that is not optimized for MTC-style applications. Hence, additional
software support is likely to be required to gain full benefit from the HPC
hardware.
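The task-graph structure described above — discrete tasks as nodes, explicit input/output dependencies as edges — can be sketched as a minimal serial scheduler. This is a hypothetical illustration; a real MTC middleware would dispatch ready tasks to many nodes in parallel and manage their I/O.

```python
from collections import deque

def run_task_graph(tasks, deps):
    """Execute a task graph in dependency order (Kahn's algorithm).

    `tasks` maps name -> zero-argument callable; `deps` maps name -> list
    of prerequisite task names. Returns the order in which tasks ran.
    """
    indeg = {t: len(deps.get(t, [])) for t in tasks}
    children = {t: [] for t in tasks}
    for t, prereqs in deps.items():
        for p in prereqs:
            children[p].append(t)
    ready = deque(t for t, d in indeg.items() if d == 0)
    order = []
    while ready:
        t = ready.popleft()
        tasks[t]()  # dispatch: here serial, in MTC this fans out to workers
        order.append(t)
        for c in children[t]:
            indeg[c] -= 1
            if indeg[c] == 0:
                ready.append(c)
    if len(order) != len(tasks):
        raise ValueError("cycle in task graph")
    return order

results = {}
tasks = {
    "a": lambda: results.setdefault("a", 1),
    "b": lambda: results.setdefault("b", results["a"] + 1),
    "c": lambda: results.setdefault("c", results["a"] + results["b"]),
}
order = run_task_graph(tasks, {"b": ["a"], "c": ["a", "b"]})
```

The engineering constraints the report highlights show up exactly in the `tasks[t]()` call: with very short tasks, dispatch overhead at that point dominates unless the scheduler is engineered for low latency.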
Monte-Carlo Simulations of Soft Matter Using SIMONA: A Review of Recent Applications
Molecular simulations such as molecular dynamics (MD) and Monte Carlo (MC) have gained increasing importance in the explanation of various physicochemical and biochemical phenomena in soft matter and help elucidate processes that often cannot be understood by experimental techniques alone. While there is a large number of computational studies and developments in MD, MC simulations are less widely used; yet they offer a powerful alternative for exploring the potential energy surface of complex systems in a way that is not feasible for atomistic MD, which remains fundamentally constrained by the femtosecond timestep, limiting investigations of many essential processes. This paper reviews the current developments of an MC-based code, SIMONA, an efficient and versatile tool for performing large-scale conformational sampling of different kinds of (macro)molecules. We provide an overview of the approach and its application to soft-matter problems, such as protocols for protein and polymer folding, physical vapor deposition of functional organic molecules, and complex oligomer modeling. SIMONA offers solutions for different levels of programming expertise (basic, expert, and developer) through a dedicated graphical-interface pre-processor, a convenient coding environment using XML, and the development of new algorithms using Python/C++. We believe that the development of versatile codes which can be used in different fields, along with related protocols and data analysis, paves the way for wider use of MC methods.
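The core of any MC sampler of this kind is the Metropolis accept/reject loop. The sketch below runs it on a toy 1D harmonic potential; it is a generic illustration of the technique, not SIMONA's API, and all names are hypothetical.

```python
import math
import random

def metropolis(energy, x0, step, n_steps, beta=1.0, seed=0):
    """Metropolis Monte Carlo sampling of exp(-beta * energy(x)) in 1D.

    Each iteration proposes a uniform random displacement and accepts it
    with probability min(1, exp(-beta * dE)).
    """
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    samples = []
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)  # trial move
        e_new = energy(x_new)
        if e_new <= e or rng.random() < math.exp(-beta * (e_new - e)):
            x, e = x_new, e_new               # accept
        samples.append(x)                     # reject keeps the old state
    return samples

# Harmonic potential E(x) = x^2 / 2; at beta = 1 the stationary
# distribution is a standard Gaussian, so <x^2> should be close to 1.
xs = metropolis(lambda x: 0.5 * x * x, x0=0.0, step=1.5, n_steps=200_000)
mean_x2 = sum(x * x for x in xs) / len(xs)
```

Because acceptance depends only on energy differences, the same loop works for arbitrarily complex conformational moves, which is what makes MC attractive where timestep-limited MD struggles.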
MOLNs: A cloud platform for interactive, reproducible and scalable spatial stochastic computational experiments in systems biology using PyURDME
Computational experiments using spatial stochastic simulations have led to
important new biological insights, but they require specialized tools, a
complex software stack, as well as large and scalable compute and data analysis
resources due to the large computational cost associated with Monte Carlo
computational workflows. The complexity of setting up and managing a
large-scale distributed computation environment to support productive and
reproducible modeling can be prohibitive for practitioners in systems biology.
This results in a barrier to the adoption of spatial stochastic simulation
tools, effectively limiting the type of biological questions addressed by
quantitative modeling. In this paper, we present PyURDME, a new, user-friendly
spatial modeling and simulation package, and MOLNs, a cloud computing appliance
for distributed simulation of stochastic reaction-diffusion models. MOLNs is
based on IPython and provides an interactive programming platform for
development of sharable and reproducible distributed parallel computational
experiments.
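The Monte Carlo workflows in question are built on stochastic simulation of reaction kinetics. As a well-mixed stand-in for the spatial reaction-diffusion (RDME) simulations PyURDME performs over a mesh, the sketch below runs Gillespie's SSA for a single decay reaction; it assumes nothing about PyURDME's actual API.

```python
import math
import random

def ssa_decay(n0, k, t_end, seed):
    """Gillespie SSA for the reaction A -> 0 with rate k per molecule.

    Starting from n0 molecules, draw exponentially distributed waiting
    times from the total propensity k * n and fire one reaction at a time
    until t_end is reached or the population is exhausted.
    """
    rng = random.Random(seed)
    t, n = 0.0, n0
    while n > 0:
        propensity = k * n
        t += rng.expovariate(propensity)  # time to the next reaction
        if t > t_end:
            break
        n -= 1
    return n

# The ensemble mean should follow the deterministic decay n0 * exp(-k*t),
# which is why such studies need many independent realizations.
runs = [ssa_decay(n0=100, k=1.0, t_end=1.0, seed=s) for s in range(2000)]
mean_n = sum(runs) / len(runs)
```

The need for thousands of independent realizations like `runs` above is exactly the embarrassingly parallel workload that MOLNs farms out across cloud workers.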
Elaborating Transition Interface Sampling Methods
We review two recently developed efficient methods for calculating rate
constants of processes dominated by rare events in high-dimensional complex
systems. The first is transition interface sampling (TIS), based on the
measurement of effective fluxes through hypersurfaces in phase space. TIS
improves efficiency with respect to standard transition path sampling (TPS)
rate constant techniques, because it allows a variable path length and is less
sensitive to recrossings. The second method is the partial path version of TIS.
Developed for diffusive processes, it exploits the loss of long time
correlation. We discuss the relation between the new techniques and the
standard reactive flux methods in detail. Path sampling algorithms can suffer
from ergodicity problems, and we introduce several new techniques to alleviate
these problems, notably path swapping, stochastic configurational bias Monte
Carlo shooting moves and order-parameter free path sampling. In addition, we
give algorithms to calculate other interesting properties from path ensembles
besides rate constants, such as activation energies and reaction mechanisms.
Comment: 36 pages, 5 figures
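TIS assembles rate constants from effective fluxes and interface-to-interface crossing probabilities. The toy sketch below estimates one such crossing probability for an unbiased 1D random walk between two absorbing interfaces, where the exact gambler's-ruin answer is available as a check; this is a didactic stand-in, not the path-sampling machinery of the paper.

```python
import random

def crossing_probability(start, lower, upper, n_trials=20_000, seed=1):
    """Estimate P(reach `upper` before `lower`) for an unbiased random
    walk on the integers started at `start`.

    For a symmetric walk the exact answer is
    (start - lower) / (upper - lower)  (gambler's ruin).
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        x = start
        while lower < x < upper:
            x += 1 if rng.random() < 0.5 else -1  # one diffusive step
        if x >= upper:
            hits += 1
    return hits / n_trials

# Starting at 3 between absorbing boundaries 0 and 10: exact P = 0.3.
p = crossing_probability(start=3, lower=0, upper=10)
```

In TIS proper, these conditional crossing probabilities are multiplied along a sequence of interfaces and combined with the effective flux through the first interface, so rare transitions are decomposed into a product of much less rare events.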