
    Computation of protein geometry and its applications: Packing and function prediction

    This chapter discusses geometric models of biomolecules and geometric constructs, including the union-of-balls model, the weighted Voronoi diagram, the weighted Delaunay triangulation, and alpha shapes. These geometric constructs enable fast and analytical computation of the shapes of biomolecules (including features such as voids and pockets) and of metric properties (such as area and volume). Algorithms for Delaunay triangulation, the computation of voids and pockets, and volume/area computation are also described. In addition, applications to the packing analysis of protein structures and to protein function prediction are discussed. Comment: 32 pages, 9 figures.
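The chapter's alpha-shape machinery computes volumes analytically; as a crude numerical counterpart (an illustration only, not the chapter's algorithm), the volume of a union-of-balls model can be estimated by Monte Carlo sampling over a bounding box. All parameters below are invented for the example.

```python
import random

def union_of_balls_volume(balls, n_samples=100_000, seed=42):
    """Monte Carlo estimate of the volume of a union of balls.

    balls: list of (x, y, z, r) tuples.
    """
    rng = random.Random(seed)
    # Axis-aligned bounding box enclosing every ball.
    lo = [min(c[i] - c[3] for c in balls) for i in range(3)]
    hi = [max(c[i] + c[3] for c in balls) for i in range(3)]
    box_vol = 1.0
    for a, b in zip(lo, hi):
        box_vol *= (b - a)
    # Fraction of random box points that land inside at least one ball.
    hits = 0
    for _ in range(n_samples):
        p = [rng.uniform(a, b) for a, b in zip(lo, hi)]
        if any((p[0] - x) ** 2 + (p[1] - y) ** 2 + (p[2] - z) ** 2 <= r * r
               for x, y, z, r in balls):
            hits += 1
    return box_vol * hits / n_samples

# Two overlapping unit balls whose centres are one radius apart
# (analytic union volume is about 7.07).
print(union_of_balls_volume([(0.0, 0.0, 0.0, 1.0), (1.0, 0.0, 0.0, 1.0)]))
```

Unlike the analytical approach of the chapter, the accuracy here scales only as the inverse square root of the sample count, which is why exact alpha-shape computations matter for large structures.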

    Forward Flux Sampling for rare event simulations

    Rare events are ubiquitous in many different fields, yet they are notoriously difficult to simulate because few, if any, events are observed in a conventional simulation run. Over the past several decades, specialised simulation methods have been developed to overcome this problem. We review one recently developed class of such methods, known as Forward Flux Sampling. Forward Flux Sampling uses a series of interfaces between the initial and final states to calculate rate constants and generate transition paths for rare events in equilibrium or nonequilibrium systems with stochastic dynamics. This review draws together a number of recent advances, summarizes several applications of the method and highlights challenges that remain to be overcome. Comment: minor typos in the manuscript; J. Phys.: Condensed Matter (accepted for publication).
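The interface construction described in the abstract can be illustrated with a deliberately simplified 1D toy (an assumption, not code from the review): a brute-force run in the initial basin supplies the flux through the first interface, and trial runs between successive interfaces supply conditional crossing probabilities whose product gives the rate. The potential, interface positions, and trial counts are all invented, and the stage-1 crossing count is simplified relative to the full method's "effective positive flux" bookkeeping.

```python
import math
import random

rng = random.Random(1)
DT, BETA = 0.01, 5.0

def step(x):
    """One Euler-Maruyama step in the double-well potential U(x) = (x^2 - 1)^2."""
    force = -4.0 * x * (x * x - 1.0)
    return x + force * DT + math.sqrt(2.0 * DT / BETA) * rng.gauss(0.0, 1.0)

A = -0.9                               # edge of the initial basin
lambdas = [-0.7, -0.3, 0.1, 0.5, 0.9]  # interfaces from basin A toward basin B

# Stage 1: simulate in basin A and harvest crossings of the first interface.
x, configs, n_steps = -1.0, [], 100_000
for _ in range(n_steps):
    x_new = step(x)
    if x <= lambdas[0] < x_new:        # upward crossing of lambda_0
        configs.append(x_new)
    x = -1.0 if x_new > lambdas[-1] else x_new  # reset rare full crossings
flux = len(configs) / (n_steps * DT)   # crossings per unit time

# Stage 2: fire trial runs from each interface toward the next one.
k = flux
for i in range(len(lambdas) - 1):
    successes = []
    for _ in range(200):
        x = rng.choice(configs)
        while A < x <= lambdas[i + 1]: # run until next interface or basin A
            x = step(x)
        if x > lambdas[i + 1]:
            successes.append(x)
    k *= len(successes) / 200.0
    if not successes:
        break
    configs = successes

print(f"estimated A->B rate constant: {k:.3e}")
```

The point of the staging is visible in the numbers: each conditional probability is modest, but none is vanishingly small, so every stage is sampled adequately even though the end-to-end event is rare.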

    Nanoinformatics

    Machine learning; Big data; Atomic resolution characterization; First-principles calculations; Nanomaterials synthesis

    A Review of Computational Methods in Materials Science: Examples from Shock-Wave and Polymer Physics

    This review discusses several computational methods used on different length and time scales for the simulation of material behavior. First, the importance of physical modeling and its relation to computer simulation across scales is discussed. Then, computational methods used on different scales are briefly reviewed, before we focus on the molecular dynamics (MD) method. Here we survey, in a tutorial-like fashion, some key issues, including several MD optimization techniques. Thereafter, computational examples of the capabilities of numerical simulations in materials research are discussed. We focus on recent results of shock-wave simulations of a solid that are based on two different modeling approaches, and we discuss their respective assets and drawbacks with a view to their application across scales. Then, the prospects of computer simulations on the molecular length scale using coarse-grained MD methods are covered by means of examples pertaining to complex topological polymer structures, including star polymers, biomacromolecules such as polyelectrolytes, and polymers with intrinsic stiffness. The review ends by highlighting new, emerging interdisciplinary applications of computational methods in the field of medical engineering, where the application of concepts from polymer physics and shock waves to biological systems holds much promise for improving medical applications such as extracorporeal shock wave lithotripsy or tumor treatment.
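The core of any MD code of the kind surveyed here is a symplectic integrator such as velocity Verlet. A minimal sketch (reduced units, a single Lennard-Jones pair, parameters invented for the example, not taken from the review) shows the characteristic property that makes the scheme standard: total energy is conserved to high accuracy over many steps.

```python
def lj_force(r, eps=1.0, sigma=1.0):
    """Lennard-Jones force along the separation axis of a 1D pair."""
    s6 = (sigma / r) ** 6
    return 24.0 * eps * (2.0 * s6 * s6 - s6) / r

def lj_energy(r, eps=1.0, sigma=1.0):
    """Lennard-Jones potential energy of the pair."""
    s6 = (sigma / r) ** 6
    return 4.0 * eps * (s6 * s6 - s6)

# Velocity-Verlet integration of the pair separation (reduced units, m = 1).
r, v, dt = 1.5, 0.0, 0.002
f = lj_force(r)
e0 = lj_energy(r) + 0.5 * v * v          # initial total energy
for _ in range(5000):
    v_half = v + 0.5 * dt * f            # half-kick
    r = r + dt * v_half                  # drift
    f = lj_force(r)                      # force at the new position
    v = v_half + 0.5 * dt * f            # second half-kick
drift = abs(lj_energy(r) + 0.5 * v * v - e0)
print(f"energy drift after 5000 steps: {drift:.2e}")
```

Production codes add neighbor lists, thermostats, and parallel decomposition (several of the optimization techniques the review surveys), but the integrator loop retains exactly this kick-drift-kick shape.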

    Optimal experimental designs for the exploration of reaction kinetic phase diagrams


    Modelling and simulation of diesel catalytic dewaxing reactors

    Designing catalytic dewaxing reactors is a major challenge in petroleum refineries owing to the lack of kinetic studies of this operation. In addition, measurements of the cloud point of diesel fuels produced in those units are still carried out using inaccurate visual procedures, which complicates process control design. In this thesis, a single-event kinetic model is developed for catalytic dewaxing on Pt/ZSM-5, an application that has not previously been explored in the scientific literature. A total of 14 kinetic parameters, independent of the feedstock type, were estimated from experimental data. The estimated parameters were then used to build a soft sensor consisting of three modules, which handles the feedstock distillation data and couples a mechanistic reactor model to a solid-liquid flash algorithm to predict the cloud point of the diesel product. Finally, a surrogate model is developed using a sequential design of experiments to simplify the sensor framework and reduce the computing time, so that it can be implemented industrially for on-line cloud point estimation. The results showed that the proposed kinetic model agreed with the observed data and was suitable for simulating the industrial operation. Pressure, temperature, and liquid hourly space velocity (LHSV) were found to be the main process variables controlling conversion in the hydrodewaxing mechanism. The proposed sensor proved suitable for studying reactor performance over different sets of operating conditions. The surrogate model drastically reduced the computing time needed to obtain cloud point estimates and proved suitable for on-line prediction. Finally, the sequential design strategy revealed that nonlinearities can strongly affect the sensor's accuracy if not properly handled.

    A comparative study between the cubic spline and b-spline interpolation methods in free energy calculations

    Numerical methods are essential in computational science, as analytic calculations for large data sets are impractical. Using numerical methods, one can approximate a problem so that it can be solved with basic arithmetic operations. Interpolation is a commonly used method for, inter alia, constructing the values of new data points within an interval of known data points. Furthermore, polynomial interpolation of sufficiently high degree can make a data set differentiable. One consequence of using high-degree polynomials, however, is oscillatory behaviour towards the endpoints, also known as Runge's phenomenon. Spline interpolation overcomes this obstacle by connecting the data points in a piecewise fashion. However, its complex formulation requires nested iterations in higher dimensions, which is time-consuming. In addition, the calculations have to be repeated to compute each partial derivative at a data point, leading to further slowdown. B-spline interpolation is an alternative representation of the cubic spline method, in which the interpolant at a point is expressed as a linear combination of piecewise basis functions. It has been proposed that this formulation can accelerate many scientific computing operations involving interpolation. Nevertheless, there is a lack of detailed comparisons to back up this hypothesis, especially when it comes to computing partial derivatives. Among the many fields of scientific research, free energy calculations stand out for their use of interpolation methods. Numerical interpolation has been employed in free energy methods for many purposes, from calculating intermediate energy states to deriving forces from free energy surfaces. The results of these calculations can provide insight into reaction mechanisms and their thermodynamic properties.
The free energy methods include biased flat-histogram methods, which are especially promising owing to their ability to accurately construct free energy profiles in the rarely visited regions of reaction space. Free Energies from Adaptive Reaction Coordinates (FEARCF), developed by Professor Kevin J. Naidoo, has many advantages over the other flat-histogram methods. Because of its treatment of the atoms in reactions, FEARCF makes it easier to apply interpolation methods. It implements cubic spline interpolation to derive biasing forces from the free energy surface, driving the reaction towards regions of higher energy. A major drawback of the method is the slowdown experienced in higher dimensions due to the complicated nature of the cubic spline routine. If that routine is replaced by the more straightforward B-spline interpolation, sampling and generating free energy surfaces can be accelerated. This dissertation aims to perform a comparative study of the cubic spline and B-spline interpolation methods. First, data sets from analytic functions were used instead of numerical data to compare the accuracy of both methods and compute their percentage errors, taking the functions themselves as reference. These functions were used to evaluate the performance of the two methods at the endpoints, at inflection points, and in regions with steep gradients. Both interpolation methods generated identical approximate values, with percentage errors below the 1% threshold, although both performed poorly at the endpoints and at points of inflection. Increasing the number of interpolation knots reduced the errors there, but caused overfitting in other regions. Although no significant speed-up was observed for univariate interpolation, the cubic spline suffered a drastic slowdown in higher dimensions, by factors of up to 10³ for 3D and 10⁵ for 4D interpolations.
The same held for classical molecular dynamics simulations with FEARCF, where a speed-up of up to 10³ was obtained when B-spline interpolation was implemented. In conclusion, the B-spline interpolation method can enhance the efficiency of free energy calculations in which cubic spline interpolation has hitherto been the method of choice.
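The cubic spline formulation whose cost the dissertation analyses can be sketched in its simplest univariate form (a generic natural cubic spline, not the FEARCF routine; the boundary conditions, knot count, and test function are assumptions for the example). The tridiagonal system for the knot second derivatives is solved with the Thomas algorithm, and the resulting piecewise cubic is evaluated per interval:

```python
import math

def natural_cubic_spline(xs, ys):
    """Return a callable evaluating the natural cubic spline through (xs, ys).

    Solves the tridiagonal system for the knot second derivatives M_i
    (Thomas algorithm), with the natural conditions M_0 = M_{n-1} = 0.
    """
    n = len(xs)
    h = [xs[i + 1] - xs[i] for i in range(n - 1)]
    a = [0.0] * n; b = [1.0] * n; c = [0.0] * n; d = [0.0] * n
    for i in range(1, n - 1):   # interior rows of the tridiagonal system
        a[i], b[i], c[i] = h[i - 1], 2.0 * (h[i - 1] + h[i]), h[i]
        d[i] = 6.0 * ((ys[i + 1] - ys[i]) / h[i]
                      - (ys[i] - ys[i - 1]) / h[i - 1])
    for i in range(1, n):       # forward elimination
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    m = [0.0] * n               # back substitution
    m[n - 1] = d[n - 1] / b[n - 1]
    for i in range(n - 2, -1, -1):
        m[i] = (d[i] - c[i] * m[i + 1]) / b[i]

    def spline(x):
        i = n - 2
        for j in range(n - 1):  # locate the interval containing x
            if x <= xs[j + 1]:
                i = j
                break
        t = x - xs[i]
        return (ys[i]
                + t * ((ys[i + 1] - ys[i]) / h[i]
                       - h[i] * (2.0 * m[i] + m[i + 1]) / 6.0)
                + t * t * m[i] / 2.0
                + t ** 3 * (m[i + 1] - m[i]) / (6.0 * h[i]))
    return spline

xs = [i * math.pi / 10 for i in range(11)]   # 11 knots on [0, pi]
s = natural_cubic_spline(xs, [math.sin(x) for x in xs])
err = max(abs(s(x) - math.sin(x)) for x in [0.05 * k for k in range(63)])
print(f"max interpolation error: {err:.2e}")
```

The solve-then-evaluate structure is what becomes expensive in higher dimensions: the tridiagonal solve must be nested per dimension, which is the slowdown the B-spline basis representation is proposed to avoid.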

    Artificial Wet Neuronal Networks from Compartmentalised Excitable Chemical Media (NEUNEU)

    This document is a guide to the results of the NEUNEU research programme, which is concerned with the development of mass-producible chemical information processing components and their interconnection into functional architectures.