    Inductive biases for graph neural networks

    Graph-structured representations are a powerful inductive bias applicable across a wide spectrum of systems in nature, ranging from atomic interactions in molecular systems to complex human interactions such as social networks. Part of the success of Graph Neural Networks (GNNs) can be attributed to their broad applicability in capturing these complex interactions. This thesis aims to extend the capabilities of GNNs by incorporating additional physics-based inductive biases. The thesis begins by enriching GNN architectures with traditional graphical inference methods to craft hybrid models. These models leverage the prior knowledge inherent in conventional graphical models along with the adaptive inference of data-driven learning, and the resulting algorithms outperform either approach run in isolation. We then implement an inductive bias as a symmetry constraint by creating E(n) Equivariant Graph Neural Networks (EGNNs), which improve upon standard GNNs by incorporating Euclidean symmetry. This improves generalization for data in n-dimensional Euclidean spaces, a property particularly relevant for molecular data. Subsequently, we demonstrate the benefits of EGNNs in various applications of deep learning for molecular modelling. The concluding part of this work incorporates Euclidean symmetries into generative models built upon the proposed EGNNs. The presented generative model significantly outperforms previous 3D molecular generative models, showing the potential to be disruptive in the future of molecular modelling.
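    The key idea behind the E(n)-equivariant layers mentioned above is to build messages only from invariant quantities (node features and pairwise distances) and to update coordinates along relative position vectors, so that the outputs transform consistently with the inputs. The sketch below illustrates this in NumPy in the spirit of EGNN; the tiny random-weight MLPs, the layer sizes, and the equivariance check are illustrative assumptions, not the thesis implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(in_dim, out_dim):
    """Tiny two-layer perceptron with fixed random weights (illustration only)."""
    W1, W2 = rng.normal(size=(in_dim, 16)), rng.normal(size=(16, out_dim))
    return lambda z: np.tanh(z @ W1) @ W2

hidden = 8
phi_e = mlp(2 * hidden + 1, hidden)   # edge/message function on invariant inputs
phi_x = mlp(hidden, 1)                # scalar weight for the coordinate update
phi_h = mlp(2 * hidden, hidden)       # node-feature update function

def egnn_layer(h, x):
    """One layer: h are invariant node features (n, hidden), x are coordinates (n, 3)."""
    n = h.shape[0]
    h_new, x_new = h.copy(), x.copy()
    for i in range(n):
        msgs, coord_shift = [], np.zeros(3)
        for j in range(n):
            if i == j:
                continue
            d2 = np.sum((x[i] - x[j]) ** 2)                   # invariant squared distance
            m_ij = phi_e(np.concatenate([h[i], h[j], [d2]]))  # message from invariants only
            msgs.append(m_ij)
            coord_shift += (x[i] - x[j]) * phi_x(m_ij)[0]     # update along relative vectors
        h_new[i] = phi_h(np.concatenate([h[i], np.sum(msgs, axis=0)]))
        x_new[i] = x[i] + coord_shift / (n - 1)
    return h_new, x_new

# Check: rotating the input coordinates rotates the output coordinates and
# leaves the node features unchanged.
h0, x0 = rng.normal(size=(5, hidden)), rng.normal(size=(5, 3))
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))     # random orthogonal transform
h1, x1 = egnn_layer(h0, x0)
h2, x2 = egnn_layer(h0, x0 @ Q.T)
print(np.allclose(h1, h2), np.allclose(x1 @ Q.T, x2))  # True True
```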

    Advances of Machine Learning in Materials Science: Ideas and Techniques

    In this big-data era, the use of large datasets in conjunction with machine learning (ML) has become increasingly popular in both industry and academia. In recent times, the field of materials science has also been undergoing a big-data revolution, with large databases and repositories appearing everywhere. Traditionally, materials science has been a trial-and-error field, in both its computational and experimental branches. With the advent of ML-based techniques, there has been a paradigm shift: materials can now be screened quickly using ML models and even generated from materials with similar properties; ML has also quietly infiltrated many sub-disciplines of materials science. However, ML remains relatively new to the field and is expanding its reach quickly. There is a plethora of readily available big-data architectures and an abundance of ML models and software; the call to integrate all these elements into a comprehensive research procedure is becoming an important direction of materials science research. In this review, we attempt to provide an introduction to and reference on ML for materials scientists, covering the commonly used methods and applications as fully as possible and discussing future possibilities. Comment: 80 pages; 22 figures. To be published in Frontiers of Physics, 18, xxxxx (2023).
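    As a concrete illustration of the screening workflow the review describes, the sketch below fits a surrogate regressor on known (descriptor, property) pairs and ranks unseen candidates by the predicted property. The synthetic descriptors, the stand-in target, and the scikit-learn random forest are assumptions for illustration; in practice the features and labels would come from a materials database (e.g. composition or structure descriptors and a computed formation energy or band gap).

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_known, n_candidates, n_features = 500, 200, 10

# Placeholder data: descriptors of already-characterized materials and a
# stand-in target property (synthetic, for illustration only).
X_known = rng.normal(size=(n_known, n_features))
y_known = X_known[:, 0] - 0.5 * X_known[:, 1] ** 2 + 0.1 * rng.normal(size=n_known)

X_train, X_test, y_train, y_test = train_test_split(
    X_known, y_known, test_size=0.2, random_state=0)

# Fit a surrogate model on the known materials and check held-out accuracy.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print(f"held-out R^2: {model.score(X_test, y_test):.2f}")

# Screen a pool of uncharacterized candidates: predict the property and keep the top few.
X_pool = rng.normal(size=(n_candidates, n_features))
ranking = np.argsort(model.predict(X_pool))[::-1]
print("top candidates by predicted property:", ranking[:5])
```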