Performance and Cost Assessment of Machine Learning Interatomic Potentials.
Machine learning of the quantitative relationship between local environment descriptors and the potential energy surface of a system of atoms has emerged as a new frontier in the development of interatomic potentials (IAPs). Here, we present a comprehensive evaluation of machine learning IAPs (ML-IAPs) based on four local environment descriptors: atom-centered symmetry functions (ACSF), smooth overlap of atomic positions (SOAP), the spectral neighbor analysis potential (SNAP) bispectrum components, and moment tensors. We use a diverse data set generated with high-throughput density functional theory (DFT) calculations, comprising bcc (Li, Mo) and fcc (Cu, Ni) metals and diamond group IV semiconductors (Si, Ge), chosen to span a range of crystal structures and bonding. All descriptors studied show excellent performance in predicting energies and forces, far surpassing that of classical IAPs, as well as in predicting properties such as elastic constants and phonon dispersion curves. We observe a general trade-off between accuracy and the degrees of freedom of each model and, consequently, computational cost. We discuss these trade-offs in the context of model selection for molecular dynamics and other applications.
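The descriptor-to-energy mapping that these ML-IAPs share can be sketched generically. The snippet below is a toy illustration, not any of the four models evaluated in the paper: it assumes a SNAP-like linear form in which the total energy is a sum of atomic contributions, each linear in a per-atom descriptor vector, and fits the coefficients by least squares on synthetic data (all descriptor values and "DFT" energies are random placeholders).

```python
import numpy as np

# Toy sketch of a linear ML-IAP (SNAP-like form, assumed for illustration):
# E_total = sum_i beta . d_i, where d_i is the descriptor of atom i.
rng = np.random.default_rng(0)

n_structs, n_atoms, n_desc = 50, 8, 10
# Per-atom descriptor vectors (e.g., bispectrum components) - synthetic here.
D = rng.normal(size=(n_structs, n_atoms, n_desc))

# Generate synthetic "DFT" total energies from a hidden coefficient vector.
beta_true = rng.normal(size=n_desc)
E = D.sum(axis=1) @ beta_true + 0.01 * rng.normal(size=n_structs)

# Because the model is linear, summing descriptors over atoms turns the
# fit into an ordinary least-squares problem for the coefficients.
A = D.sum(axis=1)                       # shape (n_structs, n_desc)
beta_fit, *_ = np.linalg.lstsq(A, E, rcond=None)
```

With clean synthetic data the fitted coefficients recover the hidden ones almost exactly; the real models differ mainly in the descriptors and, for GAP-style methods, in replacing the linear form with kernel regression.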
Building nonparametric n-body force fields using Gaussian process regression
Constructing a classical potential suited to simulate a given atomic system
is a remarkably difficult task. This chapter presents a framework under which
this problem can be tackled, based on the Bayesian construction of
nonparametric force fields of a given order using Gaussian process (GP) priors.
The formalism of GP regression is first reviewed, particularly in relation to
its application in learning local atomic energies and forces. For accurate
regression it is fundamental to incorporate prior knowledge into the GP kernel
function. To this end, this chapter details how properties of smoothness,
invariance and interaction order of a force field can be encoded into
corresponding kernel properties. A range of kernels is then proposed,
possessing all the required properties and an adjustable parameter
governing the interaction order modelled. The order best suited to describe
a given system can be found automatically within the Bayesian framework by
maximisation of the marginal likelihood. The procedure is first tested on a toy
model of known interaction and later applied to two real materials described at
the DFT level of accuracy. The models automatically selected for the two
materials were found to be in agreement with physical intuition. More
generally, it was found that lower order (simpler) models should be chosen when
the data are not sufficient to resolve more complex interactions. Low-order GPs
can be further sped up by orders of magnitude by constructing the corresponding
tabulated force field, here named "MFF".
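The Bayesian model selection described above can be sketched with a generic one-dimensional GP. This is a toy illustration, not the chapter's n-body kernels: the kernel length scale here merely stands in for the adjustable model complexity, and the data are synthetic. The log marginal likelihood log p(y|X) = -y^T K^-1 y / 2 - log|K| / 2 - (n/2) log 2*pi is computed for two candidate kernels and the better-supported one is selected.

```python
import numpy as np

def rbf(x, ell):
    """Squared-exponential kernel matrix for 1-D inputs, length scale ell."""
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-0.5 * d2 / ell**2)

def log_marginal_likelihood(x, y, ell, noise=1e-2):
    """Exact GP log evidence via a Cholesky factorisation of K + noise*I."""
    K = rbf(x, ell) + noise * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # K^-1 y
    return (-0.5 * y @ alpha
            - np.log(np.diag(L)).sum()                   # -log|K| / 2
            - 0.5 * len(x) * np.log(2 * np.pi))

# Toy data from a smooth function: the evidence should favour the kernel
# whose length scale matches the data's actual smoothness.
x = np.linspace(0.0, 1.0, 30)
y = np.sin(2 * np.pi * x)
lmls = {ell: log_marginal_likelihood(x, y, ell) for ell in (0.02, 0.2)}
best = max(lmls, key=lmls.get)
```

The evidence automatically penalises the over-flexible short-length-scale model through the log-determinant term, which is the same mechanism the chapter uses to select the interaction order n.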
Machine-learned interatomic potentials for alloys and alloy phase diagrams
We introduce machine-learned potentials for Ag-Pd to describe the energy of alloy configurations over a wide range of compositions. We compare two different approaches. Moment tensor potentials (MTPs) are polynomial-like functions of interatomic distances and angles. The Gaussian approximation potential (GAP) framework uses kernel regression, and we use the smooth overlap of atomic positions (SOAP) representation of atomic neighborhoods, which consists of a complete set of rotational and permutational invariants provided by the power spectrum of the spherical Fourier transform of the neighbor density. Both types of potentials give excellent accuracy for a wide range of compositions, competitive with the accuracy of cluster expansion, a benchmark for this system. While both models are able to describe small deformations away from the lattice positions, SOAP-GAP excels in transferability, as shown by sensible transformation paths between configurations, while MTP allows, due to its lower computational cost, the calculation of compositional phase diagrams. Given that both methods perform nearly as well as cluster expansion but yield off-lattice models, we expect them to open new avenues in computational materials modeling for alloys.
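The invariance requirement the abstract mentions can be made concrete with a small check. The snippet below uses a crude stand-in descriptor, the sorted vector of neighbor distances, not SOAP itself; the point it illustrates is the shared requirement that any descriptor fed to kernel regression be unchanged under rotations of the environment and permutations of identical neighbors. All coordinates are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

def descriptor(neighbors):
    """Sorted distances to the central atom (placed at the origin):
    a crude rotation- and permutation-invariant, standing in for SOAP."""
    return np.sort(np.linalg.norm(neighbors, axis=1))

def random_rotation(rng):
    """Random orthogonal matrix via QR of a Gaussian matrix."""
    q, r = np.linalg.qr(rng.normal(size=(3, 3)))
    return q * np.sign(np.diag(r))      # fix column signs

neigh = rng.normal(size=(6, 3))         # six synthetic neighbor positions
R = random_rotation(rng)

d_orig = descriptor(neigh)
d_rot = descriptor(neigh @ R.T)         # same environment, rotated
d_perm = descriptor(neigh[rng.permutation(6)])  # neighbors relabeled
```

Both transformed descriptors coincide with the original to floating-point precision. SOAP achieves the same invariances for the full neighbor density, which is what makes it usable inside a GAP kernel.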
Machine-learned multi-system surrogate models for materials prediction
Surrogate machine-learning models are transforming computational materials science by predicting properties of materials with the accuracy of ab initio methods at a fraction of the computational cost. We demonstrate surrogate models that simultaneously interpolate energies of different materials on a dataset of 10 binary alloys (AgCu, AlFe, AlMg, AlNi, AlTi, CoNi, CuFe, CuNi, FeV, and NbNi) with 10 different species and all possible fcc, bcc, and hcp structures up to eight atoms in the unit cell, 15,950 structures in total. We find that the deviation of prediction errors when increasing the number of simultaneously modeled alloys is less than 1 meV/atom. Several state-of-the-art materials representations and learning algorithms were found to qualitatively agree on the prediction errors of formation enthalpy with relative errors of less than 2.5% for all systems.
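The joint-interpolation idea can be sketched as a single linear surrogate fit across several synthetic "alloys", with a one-hot system label appended to a shared structural descriptor. The feature counts, model form, and data below are illustrative assumptions, not the paper's actual representations or learning algorithms.

```python
import numpy as np

# Toy multi-system surrogate (assumed setup): several systems share the
# same structure-energy physics but differ by a per-system energy offset.
rng = np.random.default_rng(2)

n_sys, n_per, n_feat = 3, 200, 6
w_shared = rng.normal(size=n_feat)      # physics common to all systems
offsets = rng.normal(size=n_sys)        # per-system reference energies

X_parts, y_parts = [], []
for s in range(n_sys):
    X = rng.normal(size=(n_per, n_feat))            # synthetic descriptors
    y = X @ w_shared + offsets[s] + 0.01 * rng.normal(size=n_per)
    onehot = np.zeros((n_per, n_sys))               # system label, one-hot
    onehot[:, s] = 1.0
    X_parts.append(np.hstack([X, onehot]))
    y_parts.append(y)

# One least-squares fit over all systems at once.
X_all, y_all = np.vstack(X_parts), np.concatenate(y_parts)
coef, *_ = np.linalg.lstsq(X_all, y_all, rcond=None)
rmse = np.sqrt(np.mean((X_all @ coef - y_all) ** 2))
```

Because the structural physics is shared, pooling the systems does not degrade the fit: the residual stays at the synthetic noise level, which is the qualitative behavior (deviation below 1 meV/atom when adding alloys) that the abstract reports for the real models.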