Autonomy and Automation. Computational modeling, reduction, and explanation in quantum chemistry
This paper discusses how computational modeling combines the autonomy of models with the automation of computational procedures. In particular, the case of ab initio methods in quantum chemistry is investigated to draw two lessons from the analysis of computational modeling. The first belongs to general philosophy of science: computational modeling faces a trade-off, enlarging predictive force at the cost of explanatory force. The other lesson concerns the philosophy of chemistry: the methodology of computational modeling casts doubt on claims about the reduction of chemistry to physics.
Computational structure‐based drug design: Predicting target flexibility
The role of molecular modeling in drug design has experienced a significant revamp in the last decade. The increase in computational resources and molecular models, along with software developments, is finally introducing a competitive advantage in the early phases of drug discovery. Medium and small companies with a strong focus on computational chemistry are being created, some of them having introduced important leads into drug design pipelines. An important source for this success is the extraordinary development of faster and more efficient techniques for describing flexibility in three-dimensional structural molecular modeling. At different levels, from docking techniques to atomistic molecular dynamics, conformational sampling between receptor and drug results in improved predictions, such as screening enrichment and the discovery of transient cavities. In this review article we perform an extensive analysis of these modeling techniques, dividing them into high and low throughput, and emphasizing their application to drug design studies. We conclude the review with a section describing our Monte Carlo method, PELE, recently highlighted as an outstanding advance in an international blind competition and in industrial benchmarks. We acknowledge the BSC-CRG-IRB Joint Research Program in Computational Biology. This work was supported by a grant from the Spanish Government, CTQ2016-79138-R. J.I. acknowledges support from SVP-2014-068797, awarded by the Spanish Government.
Computerized reduction of elementary reaction sets for CFD combustion modeling
Modeling of chemistry in Computational Fluid Dynamics (CFD) can be the most time-consuming aspect of many applications. If the entire set of elementary reactions is to be solved, a set of stiff ordinary differential equations must be integrated. Some of the reactions take place at very high rates, requiring short time steps, while others proceed more slowly and make little progress within those short integration steps.
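The stiffness the abstract refers to can be illustrated with a small, self-contained sketch. The Robertson problem below is a textbook stiff kinetics system (its species and rate constants are standard toy values, not taken from this paper); an implicit, stiffness-aware solver such as BDF takes large steps where an explicit integrator would be forced into very short ones by the fastest reactions.

```python
# Minimal sketch: integrating a stiff chemical-kinetics ODE system.
# The Robertson problem is a standard toy example whose rate constants
# span many orders of magnitude; the values are textbook ones, not the
# paper's reaction set.
import numpy as np
from scipy.integrate import solve_ivp

def robertson(t, y):
    """dy/dt for three species A, B, C with widely separated reaction rates."""
    a, b, c = y
    return [-0.04 * a + 1.0e4 * b * c,
             0.04 * a - 1.0e4 * b * c - 3.0e7 * b**2,
             3.0e7 * b**2]

y0 = [1.0, 0.0, 0.0]

# An implicit BDF solver takes large steps safely; an explicit solver such
# as RK45 would be forced into tiny time steps by the fastest reactions,
# which is the cost that reducing the elementary reaction set tries to avoid.
sol = solve_ivp(robertson, (0.0, 1.0e5), y0, method="BDF", rtol=1e-6, atol=1e-10)
print(f"BDF finished with {sol.t.size} accepted steps")
```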
Building Predictive Models in R Using the caret Package
The caret package, short for classification and regression training, contains numerous tools for developing predictive models using the rich set of models available in R. The package focuses on simplifying model training and tuning across a wide variety of modeling techniques. It also includes methods for pre-processing training data, calculating variable importance, and model visualizations. An example from computational chemistry is used to illustrate the functionality on a real data set and to benchmark the benefits of parallel processing with several types of models.
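Since the abstract describes caret's train-and-tune workflow rather than a specific API call, the sketch below shows the analogous idea in Python with scikit-learn (plainly not the caret package itself): a preprocessing pipeline, a tuning grid, and resampling-based model selection with a parallelized resampling loop. The synthetic regression data stands in for the computational chemistry data set used in the paper.

```python
# Analogous workflow in Python with scikit-learn (not the caret API):
# preprocessing + cross-validated hyperparameter tuning, the same idea
# that caret's train() wraps for R models.
from sklearn.datasets import make_regression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

# Synthetic stand-in for a real (e.g. computational chemistry) data set.
X, y = make_regression(n_samples=200, n_features=20, noise=0.1, random_state=0)

pipe = Pipeline([("scale", StandardScaler()), ("svr", SVR())])
grid = {"svr__C": [0.1, 1.0, 10.0], "svr__gamma": ["scale", 0.01, 0.1]}

# 5-fold cross-validation over the grid; n_jobs=-1 parallelizes the
# resampling loop, analogous to caret's parallel-processing benchmark.
search = GridSearchCV(pipe, grid, cv=5, n_jobs=-1)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```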
Molecular dynamics recipes for genome research
Molecular dynamics (MD) simulation allows one to predict the time evolution of a system of interacting particles. It is widely used in physics, chemistry and biology to address specific questions about the structural properties and dynamical mechanisms of model systems. MD has earned great success in genome research, as it has proved beneficial in sorting pathogenic from neutral genomic mutations. Given their computational requirements, simulations are commonly performed on HPC devices, which are generally expensive and hard to administer. However, variables such as the software tool used for modeling and simulation or the size of the molecule under investigation might make one hardware type or configuration more advantageous than another, or even make commodity hardware perfectly suitable for MD studies. This work aims to shed light on this aspect.
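As a generic illustration of what "predicting the time evolution of a system of interacting particles" means in practice, the sketch below integrates a handful of Lennard-Jones particles with velocity-Verlet time stepping in reduced units. It is a toy loop, not the HPC-scale packages (e.g. GROMACS or AMBER) typically used for the genome-research studies the abstract discusses.

```python
# Minimal sketch of the MD idea: velocity-Verlet time stepping for a few
# Lennard-Jones particles in reduced units, no cutoff or periodic boundaries.
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces for an (n, 3) array of positions."""
    n = len(pos)
    f = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = pos[i] - pos[j]
            d2 = np.dot(r, r)
            inv6 = (sigma**2 / d2) ** 3
            fmag = 24 * eps * (2 * inv6**2 - inv6) / d2
            f[i] += fmag * r
            f[j] -= fmag * r
    return f

# 8 particles on a small cubic lattice, initially at rest.
grid = np.arange(2)
pos = np.array([[x, y, z] for x in grid for y in grid for z in grid],
               dtype=float) * 1.5
vel = np.zeros_like(pos)
dt, mass = 0.001, 1.0

forces = lj_forces(pos)
for step in range(1000):                    # velocity-Verlet integration
    pos += vel * dt + 0.5 * forces / mass * dt**2
    new_forces = lj_forces(pos)
    vel += 0.5 * (forces + new_forces) / mass * dt
    forces = new_forces
print("kinetic energy:", 0.5 * mass * np.sum(vel**2))
```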
Issues and approach to develop validated analysis tools for hypersonic flows: One perspective
Critical issues concerning the modeling of low-density hypervelocity flows where thermochemical nonequilibrium effects are pronounced are discussed. Emphasis is placed on the development of validated analysis tools. A description of the activity in the Ames Research Center's Aerothermodynamics Branch is also given. Inherent in the process is a strong synergism between ground testing and real-gas computational fluid dynamics (CFD). Approaches to develop and/or enhance phenomenological models and incorporate them into computational flow-field simulation codes are discussed. These models have been partially validated with experimental data for flows where the gas temperature is raised (compressive flows). Expanding flows, where temperatures drop, however, exhibit somewhat different behavior. Experimental data for these expanding flow conditions are sparse; reliance must be placed on intuition and guidance from computational chemistry to model transport processes under these conditions. Ground-based experimental studies used to provide the necessary data for model development and validation are described, including the performance characteristics of high-enthalpy flow facilities such as shock tubes and ballistic ranges.
Multi-scale hydration modeling of calcium sulphates
Computer models for cement hydration have proven to be useful tools for understanding the chemistry of cement hydration, simulating the microstructure development of hydrating paste and predicting the properties of the hydration process /1/. One of these advanced models is CEMHYD3D, which has been used and extended at the University of Twente for the last 12 years with pore water chemistry /2/, slag cement /3/ and multi-time modeling /4/. Chen and Brouwers /5/ pointed out that the smallest size handled in CEMHYD3D, called the 'system resolution', is important for a digitized model. Features smaller than the voxel size cannot be represented, since the model works based on the movement and phase change of each discrete voxel. Furthermore, the system resolution determines the amount of computing time needed for a specific task: a higher system resolution leads to longer computational times. Due to better computational possibilities, the use of higher resolutions is possible nowadays.
This article shows the effects of using different resolutions with CEMHYD3D, both for the 'fresh' mixtures and during hydration modeling of the binder. The model has been modified to cope with several resolutions from 0.20 to 2 μm (or 500 to 50 voxels along each edge of a box of 100 μm × 100 μm × 100 μm). This paper shows two methods for the multi-scale modeling. The first method uses a modified PSD line for each resolution. The second method uses the same digitized initial microstructure, but instead of 1 voxel of 1 × 1 × 1 μm³, 8 voxels of 0.5 × 0.5 × 0.5 μm³ are used for the 200-μm system and 27 voxels of 0.33 × 0.33 × 0.33 μm³ for the 300-μm system
- …
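The voxel-subdivision step of the second method above can be sketched generically: each voxel of the digitized initial microstructure is split into n × n × n sub-voxels (n = 2 giving 8 voxels of 0.5 μm, n = 3 giving 27 voxels of 0.33 μm), which preserves phase volume fractions exactly. The array and phase labels below are illustrative assumptions, not CEMHYD3D's own data structures.

```python
# Sketch of the second method's refinement idea: keep the same digitized
# initial microstructure but subdivide every voxel into n x n x n sub-voxels.
# The phase IDs and random array below are purely illustrative.
import numpy as np

def refine_microstructure(voxels: np.ndarray, n: int) -> np.ndarray:
    """Subdivide each voxel of a 3D phase-label array into n**3 sub-voxels."""
    out = voxels
    for axis in range(3):
        out = np.repeat(out, n, axis=axis)
    return out

# A toy 100^3 microstructure at 1 um resolution with a few phase IDs
# (0 = pore, 1 = cement, 2 = water), assigned at random for illustration.
rng = np.random.default_rng(0)
coarse = rng.integers(0, 3, size=(100, 100, 100), dtype=np.uint8)

fine2 = refine_microstructure(coarse, 2)   # 200^3 voxels of 0.5 um
fine3 = refine_microstructure(coarse, 3)   # 300^3 voxels of 0.33 um
print(coarse.shape, fine2.shape, fine3.shape)

# Phase volume fractions are preserved exactly by this subdivision.
assert np.array_equal(np.bincount(coarse.ravel()) * 2**3,
                      np.bincount(fine2.ravel()))
```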
