Atomic-scale representation and statistical learning of tensorial properties
This chapter discusses the importance of incorporating three-dimensional
symmetries in the context of statistical learning models geared towards the
interpolation of the tensorial properties of atomic-scale structures. We focus
on Gaussian process regression, and in particular on the construction of
structural representations, and the associated kernel functions, that are
endowed with the geometric covariance properties compatible with those of the
learning targets. We summarize the general formulation of such a
symmetry-adapted Gaussian process regression model, and how it can be
implemented based on a scheme that generalizes the popular smooth overlap of
atomic positions representation. We give examples of the performance of this
framework when learning the polarizability and the ground-state electron
density of a molecule.
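The Gaussian process regression at the heart of such a framework can be sketched in a few lines. The sketch below uses a deliberately simple rotation- and permutation-invariant descriptor (sorted interatomic distances) in place of the symmetry-adapted SOAP machinery; all function names and parameter choices are illustrative assumptions, not the chapter's actual implementation.

```python
import numpy as np

def invariant_descriptor(positions):
    # Sorted pairwise distances: rotation- and permutation-invariant,
    # though (unlike SOAP) not a complete representation in general.
    n = len(positions)
    d = [np.linalg.norm(positions[i] - positions[j])
         for i in range(n) for j in range(i + 1, n)]
    return np.sort(d)

def rbf_kernel(x, y, sigma=1.0):
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

def gpr_fit_predict(train_structs, train_y, test_structs, reg=1e-8):
    # Plain kernel (ridge) regression: solve (K + reg*I) alpha = y,
    # then predict with the train/test kernel matrix.
    X = [invariant_descriptor(s) for s in train_structs]
    K = np.array([[rbf_kernel(a, b) for b in X] for a in X])
    alpha = np.linalg.solve(K + reg * np.eye(len(X)), np.asarray(train_y))
    Kt = np.array([[rbf_kernel(invariant_descriptor(s), b) for b in X]
                   for s in test_structs])
    return Kt @ alpha
```

Because the descriptor is invariant, predictions are automatically identical for rotated copies of a structure; covariance with tensorial targets, by contrast, requires the symmetry-adapted kernels the chapter describes.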
Recursive evaluation and iterative contraction of N-body equivariant features
Mapping an atomistic configuration to an N-point correlation of a field
associated with the atomic positions (e.g. an atomic density) has emerged as an
elegant and effective solution to represent structures as the input of
machine-learning algorithms. While it has become clear that low-order density
correlations do not provide a complete representation of an atomic environment,
the exponential increase in the number of possible N-body invariants makes it
difficult to design a concise and effective representation. We discuss how to
exploit recursion relations between equivariant features of different orders
(generalizations of N-body invariants that provide a complete representation
of the symmetries of improper rotations) to compute high-order terms
efficiently. In combination with the automatic selection of the most expressive
combination of features at each order, this approach provides a conceptual and
practical framework to generate systematically-improvable, symmetry adapted
representations for atomistic machine learning.
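The recursion over equivariant features is easiest to illustrate in a two-dimensional toy analogue, where circular harmonics exp(i*m*theta) stand in for spherical harmonics: multiplying an order-nu equivariant of index m1 by a two-body feature of index m2 yields an order-(nu+1) equivariant of index m1+m2, and the m = 0 components are invariants. This is a hypothetical simplification, not the actual iterative-contraction algorithm:

```python
import numpy as np

def equivariant_features(points, m_max=3):
    # nu = 1 (two-body) equivariants: circular-harmonic coefficients of
    # the neighbour density in 2-D. A rotation by theta multiplies the
    # m-th coefficient by exp(1j * m * theta).
    angles = np.arctan2(points[:, 1], points[:, 0])
    return {m: np.exp(1j * m * angles).sum()
            for m in range(-m_max, m_max + 1)}

def raise_order(feats, two_body, m_max=3):
    # One recursion step: products of order-nu and two-body features,
    # re-indexed by m1 + m2, are order-(nu + 1) equivariants.
    out = {m: 0.0 for m in range(-m_max, m_max + 1)}
    for m1, v1 in feats.items():
        for m2, v2 in two_body.items():
            if abs(m1 + m2) <= m_max:
                out[m1 + m2] += v1 * v2
    return out
```

The m = 0 entry at each order is an invariant, and truncating to |m| <= m_max at every step keeps the feature count from growing exponentially, loosely mirroring the contraction idea described above.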
Symmetry Regularization
The properties of a representation, such as smoothness, adaptability, generality, equivariance/invariance, depend on restrictions imposed during learning. In this paper, we propose using data symmetries, in the sense of equivalences under transformations, as a means for learning symmetry-adapted representations, i.e., representations that are equivariant to transformations in the original space. We provide a sufficient condition to enforce the representation, for example the weights of a neural network layer or the atoms of a dictionary, to have a group structure and specifically the group structure in an unlabeled training set. By reducing the analysis of generic group symmetries to permutation symmetries, we devise an analytic expression for a regularization scheme and a permutation invariant metric on the representation space. Our work provides a proof of concept on why and how to learn equivariant representations, without explicit knowledge of the underlying symmetries in the data. This material is based upon work supported by the Center for Brains, Minds and Machines (CBMM), funded by NSF STC award CCF-1231216.
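As a toy illustration of the commutation condition behind such a regularizer (not the paper's analytic expression), one can penalize a weight matrix for failing to commute with every element of a known permutation group; the penalty vanishes exactly on permutation-equivariant linear maps:

```python
import numpy as np
from itertools import permutations

def perm_matrices(n):
    # All permutation matrices of the symmetric group S_n.
    mats = []
    for p in permutations(range(n)):
        P = np.zeros((n, n))
        P[np.arange(n), list(p)] = 1.0
        mats.append(P)
    return mats

def symmetry_penalty(W, group):
    # An equivariant linear map satisfies W P = P W for every group
    # element P; penalize the squared Frobenius norm of the commutator.
    return sum(np.sum((W @ P - P @ W) ** 2) for P in group)
```

For the full symmetric group, the penalty is zero precisely on matrices of the form a*I + b*ones, the well-known permutation-equivariant linear layers.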
A Transferable Machine-Learning Model of the Electron Density
The electronic charge density plays a central role in determining the
behavior of matter at the atomic scale, but its computational evaluation
requires demanding electronic-structure calculations. We introduce an
atom-centered, symmetry-adapted framework to machine-learn the valence charge
density based on a small number of reference calculations. The model is highly
transferable, meaning it can be trained on electronic-structure data of small
molecules and used to predict the charge density of larger compounds with low,
linear-scaling cost. Applications are shown for various hydrocarbon molecules
of increasing complexity and flexibility, and demonstrate the accuracy of the
model when predicting the density on octane and octatetraene after training
exclusively on butane and butadiene. This transferable, data-driven model can
be used to interpret experiments, initialize electronic structure calculations,
and compute electrostatic interactions in molecules and condensed-phase
systems.
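The atom-centered, linear-scaling structure of such a model can be sketched as follows. The radial-histogram descriptor, the ridge-regression fit, and all names are simplifying assumptions; the actual model predicts the coefficients of a symmetry-adapted density basis.

```python
import numpy as np

def local_descriptor(i, positions, cutoff=3.0, bins=8):
    # Invariant description of atom i's environment: histogram of
    # neighbour distances within a cutoff (a crude stand-in).
    d = np.linalg.norm(positions - positions[i], axis=1)
    d = d[(d > 0) & (d < cutoff)]
    hist, _ = np.histogram(d, bins=bins, range=(0.0, cutoff))
    return hist.astype(float)

def fit_coeff_model(envs, coeffs, reg=1e-8):
    # Ridge regression from environment descriptors to per-atom
    # basis coefficients of the charge density.
    X, Y = np.array(envs), np.array(coeffs)
    return np.linalg.solve(X.T @ X + reg * np.eye(X.shape[1]), X.T @ Y)

def predict_density_coeffs(positions, W):
    # Linear-scaling prediction: one descriptor and one matrix
    # product per atom, independent of total system size.
    return np.array([local_descriptor(i, positions) @ W
                     for i in range(len(positions))])
```

Because each atom is treated through its local environment only, a model fitted on small molecules can be applied unchanged to larger ones, which is the transferability mechanism described above.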
Invariance of visual operations at the level of receptive fields
Receptive field profiles registered by cell recordings have shown that
mammalian vision has developed receptive fields tuned to different sizes and
orientations in the image domain as well as to different image velocities in
space-time. This article presents a theoretical model by which families of
idealized receptive field profiles can be derived mathematically from a small
set of basic assumptions that correspond to structural properties of the
environment. The article also presents a theory for how basic invariance
properties to variations in scale, viewing direction and relative motion can be
obtained from the output of such receptive fields, using complementary
selection mechanisms that operate over the output of families of receptive
fields tuned to different parameters. Thereby, the theory shows how basic
invariance properties of a visual system can be obtained already at the level
of receptive fields, and we can explain the different shapes of receptive field
profiles found in biological vision from a requirement that the visual system
should be invariant to the natural types of image transformations that occur in
its environment. Comment: 40 pages, 17 figures
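In one dimension the idealized receptive fields reduce to Gaussian derivative operators, and the selection mechanism for scale invariance amounts to maximizing a scale-normalized response over a family of scales. A minimal sketch, assuming gamma = 1 normalization (function names are illustrative):

```python
import numpy as np

def gaussian(x, sigma):
    return np.exp(-x ** 2 / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)

def scale_normalized_response(signal, x, sigma):
    # Idealized 1-D receptive field: second derivative of a Gaussian,
    # scale-normalized by sigma**2 (gamma = 1).
    kernel = (x ** 2 / sigma ** 2 - 1.0) / sigma ** 2 * gaussian(x, sigma)
    dx = x[1] - x[0]
    return sigma ** 2 * np.convolve(signal, kernel, mode="same") * dx

def selected_scale(signal, x, sigmas):
    # Complementary selection mechanism: pick the scale maximizing the
    # magnitude of the normalized response at the centre.
    centre = len(x) // 2
    responses = [abs(scale_normalized_response(signal, x, s)[centre])
                 for s in sigmas]
    return sigmas[int(np.argmax(responses))]
```

For a Gaussian blob of width t0, the maximum in this 1-D toy is attained near sigma = sqrt(2) * t0, so the selected scale tracks the size of the structure, which is the essence of the scale-invariance property discussed above.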
Machine-learning of atomic-scale properties based on physical principles
We briefly summarize the kernel regression approach, as used recently in
materials modelling, to fitting functions, particularly potential energy
surfaces, and highlight how the linear algebra framework can be used to both
predict and train from linear functionals of the potential energy, such as the
total energy and atomic forces. We then give a detailed account of the Smooth
Overlap of Atomic Positions (SOAP) representation and kernel, showing how it
arises from an abstract representation of smooth atomic densities, and how it
is related to several popular density-based representations of atomic
structure. We also discuss recent generalisations that allow fine control of
correlations between different atomic species, prediction and fitting of
tensorial properties, and also how to construct structural kernels (applicable
to comparing entire molecules or periodic systems) that go beyond an additive
combination of local environments.
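A structural kernel built as an additive (here, averaged) combination of local-environment kernels can be sketched as below. The radial-histogram descriptor is a crude stand-in for SOAP, and the power zeta mimics the nonlinear environment kernels used in practice; both are illustrative assumptions.

```python
import numpy as np

def env_descriptor(i, positions, cutoff=4.0, bins=6):
    # Normalized radial histogram of atom i's neighbourhood: a
    # rotation-invariant stand-in for a SOAP vector.
    d = np.linalg.norm(positions - positions[i], axis=1)
    d = d[(d > 0) & (d < cutoff)]
    h, _ = np.histogram(d, bins=bins, range=(0.0, cutoff))
    h = h.astype(float)
    n = np.linalg.norm(h)
    return h / n if n > 0 else h

def structure_kernel(A, B, zeta=2):
    # Additive structural kernel: average over all pairs of local
    # environments of the (dot-product) environment kernel ** zeta.
    DA = np.array([env_descriptor(i, A) for i in range(len(A))])
    DB = np.array([env_descriptor(i, B) for i in range(len(B))])
    return np.mean((DA @ DB.T) ** zeta)
```

Kernels of this additive form compare structures of different sizes directly; the generalizations mentioned above go beyond it by coupling environments nonlinearly.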