
    Characterizing water-metal interfaces and machine learning potential energy surfaces

    In this thesis, we first discuss the fundamentals of ab initio electronic structure theory and density functional theory (DFT). We also discuss statistics related to computing thermodynamic averages from molecular dynamics (MD). We then use this theory to analyze and compare the structural, dynamical, and electronic properties of liquid water next to prototypical electrode materials, including platinum, graphite, and graphene. Our results are built on Born-Oppenheimer molecular dynamics (BOMD) generated using DFT that explicitly includes van der Waals (vdW) interactions within a first-principles approach. All calculations reported use large simulation cells, allowing for an accurate treatment of the water-electrode interfaces. We have included vdW interactions through the use of the optB86b-vdW exchange-correlation functional. Comparisons with the Perdew-Burke-Ernzerhof (PBE) exchange-correlation functional are also shown. We find an initial peak, due to chemisorption, in the density profile of the liquid water-Pt interface that is not seen in the liquid water-graphite or liquid water-graphene interfaces, nor in interfaces studied previously. To further investigate this chemisorption peak, we also report differences in the electronic structure of single water molecules on both Pt and graphite surfaces. We find that a covalent bond forms between the single water molecule and the platinum surface, but not between the single water molecule and the graphite surface. We also discuss the effects that defects and dopants in the graphite and graphene surfaces have on the structure and dynamics of liquid water. Lastly, we introduce artificial neural networks (ANNs) and demonstrate how they can be used to machine-learn electronic structure calculations. As a proof of principle, we show the success of an ANN potential energy surface for a dimer molecule governed by a Lennard-Jones potential.
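    As a rough illustration of that closing proof of principle, the sketch below fits a small feed-forward network to the one-dimensional potential energy surface of a Lennard-Jones dimer. The potential parameters, sampling range, and network architecture are assumptions made for demonstration, not the settings used in the thesis.

    # Minimal ANN potential energy surface for a Lennard-Jones dimer (illustrative).
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def lennard_jones(r, epsilon=1.0, sigma=1.0):
        """V(r) = 4*eps*[(sigma/r)**12 - (sigma/r)**6] for a single dimer."""
        sr6 = (sigma / r) ** 6
        return 4.0 * epsilon * (sr6 ** 2 - sr6)

    # Training data: dimer separations sampled around the potential minimum.
    r_train = np.linspace(0.9, 3.0, 400).reshape(-1, 1)
    e_train = lennard_jones(r_train.ravel())

    # A small fully connected network suffices for a one-dimensional surface.
    ann = MLPRegressor(hidden_layer_sizes=(32, 32), activation="tanh",
                       solver="lbfgs", max_iter=5000, random_state=0)
    ann.fit(r_train, e_train)

    # Compare against the analytic potential on held-out separations.
    r_test = np.linspace(0.95, 2.8, 50).reshape(-1, 1)
    mae = np.mean(np.abs(ann.predict(r_test) - lennard_jones(r_test.ravel())))
    print(f"ANN PES mean absolute error: {mae:.2e}")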

    Machine Learning Diffusion Monte Carlo Energies

    We present two machine learning methodologies that are capable of predicting diffusion Monte Carlo (DMC) energies with small datasets (~60 DMC calculations in total). The first uses voxel deep neural networks (VDNNs) to predict DMC energy densities using Kohn-Sham density functional theory (DFT) electron densities as input. The second uses kernel ridge regression (KRR) to predict atomic contributions to the DMC total energy using atomic environment vectors as input (we used atom-centred symmetry functions, atomic environment vectors from the ANI models, and smooth overlap of atomic positions). We first compare the methodologies on pristine graphene lattices, where we find that the KRR methodology performs best in comparison to gradient-boosted decision trees, random forests, Gaussian process regression, and multilayer perceptrons. In addition, KRR outperforms VDNNs by an order of magnitude. Afterwards, we study the generalizability of KRR by predicting the energy barrier associated with a Stone-Wales defect. Lastly, we move from 2D to 3D materials and use KRR to predict total energies of liquid water. In all cases, we find that the KRR models are more accurate than Kohn-Sham DFT, and all mean absolute errors are less than chemical accuracy.
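    The KRR pipeline can be sketched in a few lines. In the toy version below, random vectors stand in for the real atomic environment descriptors (ACSF, ANI vectors, or SOAP) and a synthetic target stands in for the DMC energies; the kernel hyperparameters are placeholder assumptions that would normally be cross-validated.

    # Toy kernel ridge regression from descriptors to energies (illustrative).
    import numpy as np
    from sklearn.kernel_ridge import KernelRidge
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_structures, n_features = 60, 32      # ~60 calculations, as in the paper

    X = rng.normal(size=(n_structures, n_features))    # stand-in descriptors
    w = rng.normal(size=n_features)
    y = X @ w + 0.01 * rng.normal(size=n_structures)   # stand-in energies

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                              random_state=0)

    # Gaussian (RBF) kernel; alpha and gamma are illustrative values.
    krr = KernelRidge(kernel="rbf", alpha=1e-6, gamma=0.05)
    krr.fit(X_tr, y_tr)

    mae = np.mean(np.abs(krr.predict(X_te) - y_te))
    print(f"KRR mean absolute error on held-out structures: {mae:.2e}")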

    Twin Neural Network Regression

    We introduce twin neural network (TNN) regression. This method predicts differences between the target values of two different data points rather than the targets themselves. The solution of a traditional regression problem is then obtained by averaging over an ensemble of all predicted differences between the targets of an unseen data point and all training data points. Whereas ensembles are normally costly to produce, TNN regression intrinsically creates an ensemble of predictions of twice the size of the training set while training only a single neural network. Since ensembles have been shown to be more accurate than single models, this property naturally transfers to TNN regression. We show that TNNs are able to compete with, or yield more accurate predictions than, other state-of-the-art methods on a variety of data sets. Furthermore, TNN regression is constrained by self-consistency conditions, and we find that the violation of these conditions provides an estimate of the prediction uncertainty.
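    The core idea translates directly into code. In the sketch below, a single network F(x_i, x_j) is trained to predict y_i - y_j on all training pairs, and a new point's target is estimated by averaging the anchored differences over both pair orderings, giving the ensemble of twice the training-set size described above. The toy data set and network size are assumptions for illustration.

    # Toy twin neural network regression on pairwise target differences.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    X = rng.uniform(-2, 2, size=(80, 1))
    y = np.sin(X).ravel() + 0.05 * rng.normal(size=80)   # toy 1D targets

    # Build all ordered training pairs with difference labels y_i - y_j.
    i, j = np.meshgrid(np.arange(len(X)), np.arange(len(X)), indexing="ij")
    pairs = np.hstack([X[i.ravel()], X[j.ravel()]])
    diffs = y[i.ravel()] - y[j.ravel()]

    tnn = MLPRegressor(hidden_layer_sizes=(64, 64), activation="tanh",
                       max_iter=2000, random_state=0)
    tnn.fit(pairs, diffs)

    def tnn_predict(x_new):
        """Average predicted differences over both orderings (2n estimates)."""
        x_rep = np.repeat(x_new[None, :], len(X), axis=0)
        fwd = tnn.predict(np.hstack([x_rep, X])) + y   # F(x, x_j) + y_j
        bwd = y - tnn.predict(np.hstack([X, x_rep]))   # y_j - F(x_j, x)
        estimates = np.concatenate([fwd, bwd])
        return estimates.mean(), estimates.std()       # spread ~ uncertainty

    mean, spread = tnn_predict(np.array([0.5]))
    print(f"prediction: {mean:.3f} +/- {spread:.3f} (true {np.sin(0.5):.3f})")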

    Deep learning and density-functional theory


    Structural characterization of water-metal interfaces
