
    Understanding and Enhancing Graph Neural Networks From the Perspective of Partial Differential Equations

    We study graph neural networks from the perspective of partial differential equations. First, based on the relationship between partial differential equations and the propagation equation of graph neural networks, the topology and node features are treated as independent variables of the wave function, so that the topological structure information of the graph is better combined with the node feature information. Second, the theoretical framework of the graph neural network model PGNN is established through separation of variables in the partial differential equation, which shows that several existing models approximate PGNN to different degrees. Finally, experiments show that the proposed model achieves good results on commonly used citation datasets.

    Neural Message Passing with Edge Updates for Predicting Properties of Molecules and Materials

    Neural message passing on molecular graphs is one of the most promising methods for predicting formation energy and other properties of molecules and materials. In this work, we extend the neural message passing model with an edge update network, which allows the information exchanged between atoms to depend on the hidden state of the receiving atom. We benchmark the proposed model on three publicly available datasets (QM9, the Materials Project, and OQMD) and show that it yields superior prediction of formation energies and other properties on all three datasets in comparison with the best published results. Furthermore, we investigate different methods for constructing the graph used to represent crystalline structures, and we find that a graph based on K-nearest neighbors achieves better prediction accuracy than a maximum-distance cutoff or the Voronoi tessellation graph.
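The edge-update idea described in the abstract can be illustrated with a minimal numpy sketch. This is not the paper's architecture: the graph, dimensions, and single-hidden-layer update functions below are illustrative stand-ins; the key point shown is that the edge state is recomputed from both endpoint hidden states before messages are formed, so messages can depend on the receiving atom.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, W, b):
    # one-layer perceptron with ReLU, standing in for each learned update function
    return np.maximum(0.0, x @ W + b)

n_nodes, dim = 4, 8
h = rng.normal(size=(n_nodes, dim))              # node (atom) hidden states
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]         # directed sender -> receiver pairs
e = {ij: rng.normal(size=dim) for ij in edges}   # edge hidden states

W_e, b_e = rng.normal(size=(3 * dim, dim)), np.zeros(dim)
W_m, b_m = rng.normal(size=(2 * dim, dim)), np.zeros(dim)
W_h, b_h = rng.normal(size=(2 * dim, dim)), np.zeros(dim)

# 1) edge update: the new edge state sees BOTH endpoint hidden states,
#    including the receiver's, before any message is sent
for (i, j) in edges:
    e[(i, j)] = mlp(np.concatenate([h[i], h[j], e[(i, j)]]), W_e, b_e)

# 2) messages along edges, summed per receiving node
m = np.zeros_like(h)
for (i, j) in edges:
    m[j] += mlp(np.concatenate([h[i], e[(i, j)]]), W_m, b_m)

# 3) node update from the aggregated messages
h = mlp(np.concatenate([h, m], axis=1), W_h, b_h)
print(h.shape)  # (4, 8)
```

One such round constitutes a single message-passing step; a full model would stack several steps and read out a molecular property from the final node states.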

    Functional division of the dorsal striatum based on a graph neural network

    The dorsal striatum, an essential nucleus in subcortical areas, has a crucial role in controlling a variety of complex cognitive behaviors; however, few studies have been conducted in recent years to explore the functional subregions of the dorsal striatum that are significantly activated when performing multiple tasks. To explore the differences and connections between the functional subregions of the dorsal striatum that are significantly activated when performing different tasks, we propose a framework for functional division of the dorsal striatum based on a graph neural network model. First, time series information for each voxel in the dorsal striatum is extracted from acquired functional magnetic resonance imaging data and used to calculate the connection strength between voxels. Then, a graph is constructed using the voxels as nodes and the connection strengths between voxels as edges. Finally, the graph data are analyzed using the graph neural network model to functionally divide the dorsal striatum. The framework was used to divide functional subregions related to four tasks: olfactory reward, "0-back" working memory, emotional picture stimulation, and capital investment decision-making. The results were further subjected to conjunction analysis to obtain 15 functional subregions in the dorsal striatum. The 15 different functional subregions divided based on the graph neural network model indicate that there is functional differentiation in the dorsal striatum when the brain performs different cognitive tasks. The spatial localization of the functional subregions contributes to a clear understanding of the differences and connections between functional subregions.
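The graph-construction step described above (voxel time series to connection strengths to edges) can be sketched as follows. The abstract does not specify the connectivity measure or threshold, so absolute Pearson correlation and the cutoff value here are assumptions; the voxel data is a random toy stand-in for fMRI time series.

```python
import numpy as np

rng = np.random.default_rng(1)

# toy stand-in for fMRI data: 6 voxels, 100 time points each
ts = rng.normal(size=(6, 100))

# connection strength between voxels: absolute Pearson correlation (assumed measure)
corr = np.abs(np.corrcoef(ts))
np.fill_diagonal(corr, 0.0)  # no self-loops

# keep an edge wherever the strength exceeds a (hypothetical) threshold
threshold = 0.05
rows, cols = np.nonzero(corr > threshold)
edge_index = np.stack([rows, cols])   # 2 x num_edges layout, as used by common GNN libraries
edge_weight = corr[rows, cols]

print(edge_index.shape)
```

The resulting `edge_index` and `edge_weight` arrays, together with per-voxel features, are the typical inputs to a graph neural network model.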

    Graph Neural Networks Meet Neural-Symbolic Computing: A Survey and Perspective

    Neural-symbolic computing has now become the subject of interest of both academic and industry research laboratories. Graph Neural Networks (GNNs) have been widely used in relational and symbolic domains, with widespread application in combinatorial optimization, constraint satisfaction, relational reasoning, and other scientific domains. The need for improved explainability, interpretability, and trust of AI systems in general demands principled methodologies, as suggested by neural-symbolic computing. In this paper, we review the state of the art on the use of GNNs as a model of neural-symbolic computing. This includes the application of GNNs in several domains as well as their relationship to current developments in neural-symbolic computing.
    Comment: Updated version, draft of accepted IJCAI 2020 survey paper.

    Novel deep learning methods for track reconstruction

    For the past year, the HEP.TrkX project has been investigating machine learning solutions to LHC particle track reconstruction problems. A variety of models were studied that drew inspiration from computer vision applications and operated on an image-like representation of tracking detector data. While these approaches have shown some promise, image-based methods face challenges in scaling up to realistic HL-LHC data due to high dimensionality and sparsity. In contrast, models that can operate on the spacepoint representation of track measurements ("hits") can exploit the structure of the data to solve tasks efficiently. In this paper we show two sets of new deep learning models for reconstructing tracks using spacepoint data arranged as sequences or connected graphs. In the first set of models, Recurrent Neural Networks (RNNs) are used to extrapolate, build, and evaluate track candidates, akin to Kalman Filter algorithms. Such models can express their own uncertainty when trained with an appropriate likelihood loss function. The second set of models uses Graph Neural Networks (GNNs) for the tasks of hit classification and segment classification. These models read a graph of connected hits and compute features on the nodes and edges. They adaptively learn which hit connections are important and which are spurious. The models are scalable, with simple architectures and relatively few parameters. Results for all models are presented on ACTS generic detector simulated data.
    Comment: CTD 2018 proceedings.
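The segment-classification task described above reduces to scoring candidate edges of the hit graph and keeping those likely to be true track segments. The sketch below is not the paper's GNN (which passes messages over the graph before scoring); it only illustrates the input/output contract with a single linear scorer on random toy hits, so the coordinates, weights, and 0.5 cutoff are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# toy spacepoints: 5 hits with 3 coordinates each (e.g. r, phi, z)
hits = rng.normal(size=(5, 3))
# candidate segments: pairs of hits that might belong to the same track
segments = [(0, 1), (1, 2), (0, 3), (3, 4)]

dim = hits.shape[1]
W, b = rng.normal(size=(2 * dim,)), 0.0  # untrained stand-in for the edge network

# segment classification: score each candidate edge from its endpoint features
scores = {(i, j): sigmoid(np.dot(np.concatenate([hits[i], hits[j]]), W) + b)
          for (i, j) in segments}

# a trained model would threshold the scores to keep true track segments
keep = [seg for seg, s in scores.items() if s > 0.5]
print(len(scores), keep)
```

In the actual models, the endpoint features fed to the edge scorer are the node embeddings produced by several rounds of message passing, which is what lets the network learn which hit connections are spurious.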

    Graph Neural Networks for Particle Reconstruction in High Energy Physics detectors

    Pattern recognition problems in high energy physics are notably different from traditional machine learning applications in computer vision. Reconstruction algorithms identify and measure the kinematic properties of particles produced in high energy collisions and recorded with complex detector systems. Two critical applications are the reconstruction of charged particle trajectories in tracking detectors and the reconstruction of particle showers in calorimeters. These two problems have unique challenges and characteristics, but both have high dimensionality, a high degree of sparsity, and complex geometric layouts. Graph Neural Networks (GNNs) are a relatively new class of deep learning architectures that can deal with such data effectively, allowing scientists to incorporate domain knowledge in a graph structure and learn powerful representations that leverage that structure to identify patterns of interest. In this work we demonstrate the applicability of GNNs to these two diverse particle reconstruction problems.
    Comment: Presented at the NeurIPS 2019 workshop "Machine Learning and the Physical Sciences".