20 research outputs found

    Linear graph convolutional networks

    Many neural networks for graphs are based on the graph convolution operator, proposed more than a decade ago. Since then, many alternative definitions have been proposed that tend to add complexity (and non-linearity) to the model. In this paper, we follow the opposite direction by proposing a linear graph convolution operator. Despite its simplicity, we show that our convolution operator is more theoretically grounded than many proposals in the literature and shows improved predictive performance.
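    The abstract does not spell out the operator itself; as a rough illustration, a minimal sketch of a linear graph convolution in the spirit of simplified/linear GCNs (symmetrically normalized propagation with a single weight matrix and no nonlinearity) might look like the following. The normalization choice and function names are assumptions, not the paper's definition.

```python
# Minimal sketch of a linear graph convolution layer. Assumption: the operator
# resembles the standard normalized propagation X' = D^{-1/2}(A+I)D^{-1/2} X W
# without an activation function; the abstract does not give the exact definition.
import numpy as np

def linear_graph_convolution(adjacency: np.ndarray, features: np.ndarray,
                             weights: np.ndarray) -> np.ndarray:
    """One linear propagation step over a graph: no nonlinearity applied."""
    a_hat = adjacency + np.eye(adjacency.shape[0])           # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))   # D^{-1/2}
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt                  # symmetric normalization
    return a_norm @ features @ weights                        # linear map, no ReLU

# Example: 3-node path graph, 2 input features, 2 output features.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
X = np.random.randn(3, 2)
W = np.random.randn(2, 2)
print(linear_graph_convolution(A, X, W).shape)  # (3, 2)
```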

    Towards interactive betweenness centrality estimation for transportation network using capsule network

    The node importance of a graph needs to be estimated for many graph-based applications. One of the most popular metrics for measuring node importance is betweenness centrality (BC), which measures the amount of influence a node has over the flow of information in a graph. However, the computational complexity of calculating betweenness centrality is extremely high for large-scale graphs. This is especially true when analyzing the road networks of states with millions of nodes and edges, making it infeasible to calculate their BC in real time using traditional iterative methods. Applying a machine learning model to predict the importance of nodes provides an opportunity to address this issue. Graph Neural Networks (GNNs), which have been gaining popularity in recent years, are particularly well suited for graph analysis. In this study, we propose a deep learning architecture, RoadCaps, to estimate BC by merging Capsule Neural Networks with Graph Convolutional Networks (GCN), a convolution-operation-based GNN. We target the effective aggregation of features from neighbor nodes to approximate the correct BC of a node. We leverage the pattern-capturing strength of the capsule network to effectively estimate node-level BC from the high-level information generated by the GCN block. We further compare the accuracy and effectiveness of RoadCaps with two other GCN-based models. We also analyze the efficiency and effectiveness of RoadCaps with respect to aspects such as scalability and robustness. We perform an empirical benchmark on the road network of the entire state of California. The overall analysis shows that our proposed network provides more accurate road importance estimation, which is helpful for rapid response planning such as evacuation during wildfires and flooding.
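    For context, the quantity RoadCaps is trained to approximate is standard betweenness centrality. A minimal sketch of the exact computation (Brandes' algorithm as implemented in networkx) illustrates the baseline whose cost becomes prohibitive on state-wide road networks; the toy graph below is purely illustrative and is not the paper's dataset.

```python
# Exact betweenness centrality (the target RoadCaps approximates), computed with
# Brandes' algorithm via networkx. Tractable only for modest graph sizes; the
# paper's point is that this does not scale to road networks with millions of nodes.
import networkx as nx

G = nx.Graph()
G.add_edges_from([("a", "b"), ("b", "c"), ("c", "d"), ("b", "d"), ("d", "e")])

bc = nx.betweenness_centrality(G, normalized=True)
for node, score in sorted(bc.items(), key=lambda kv: -kv[1]):
    print(f"{node}: {score:.3f}")
```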

    A Comprehensive Survey on Graph Neural Networks

    Deep learning has revolutionized many machine learning tasks in recent years, ranging from image classification and video processing to speech recognition and natural language understanding. The data in these tasks are typically represented in the Euclidean space. However, there is an increasing number of applications where data are generated from non-Euclidean domains and are represented as graphs with complex relationships and interdependency between objects. The complexity of graph data has imposed significant challenges on existing machine learning algorithms. Recently, many studies on extending deep learning approaches for graph data have emerged. In this article, we provide a comprehensive overview of graph neural networks (GNNs) in the data mining and machine learning fields. We propose a new taxonomy to divide the state-of-the-art GNNs into four categories, namely, recurrent GNNs, convolutional GNNs, graph autoencoders, and spatial-temporal GNNs. We further discuss the applications of GNNs across various domains and summarize the open-source codes, benchmark data sets, and model evaluation of GNNs. Finally, we propose potential research directions in this rapidly growing field.

    Scalable Nearest Neighbor Search with Compact Codes

    An important characteristic of the recent decade is the dramatic growth in the use and generation of data. From collections of images, documents, and videos, to genetic data, and to network traffic statistics, modern technologies and cheap storage have made it possible to accumulate huge datasets. But how can we effectively use all this data? The growing sizes of modern datasets make it crucial to develop new algorithms and tools capable of sifting through this data efficiently. A central computational primitive for analyzing large datasets is the Nearest Neighbor Search problem, in which the goal is to preprocess a set of objects so that later, given a query object, one can find the data object closest to the query. In most situations involving high-dimensional objects, the exhaustive search that compares the query with every item in the dataset has a prohibitive cost in both runtime and memory. This thesis focuses on the design of algorithms and tools for fast and cost-efficient nearest neighbor search. The proposed techniques advocate the use of compressed and discrete codes for representing the neighborhood structure of data in a compact way. Transforming high-dimensional items, such as raw images, into similarity-preserving compact codes has both computational and storage advantages: compact codes can be stored efficiently using only a few bits per data item and, more importantly, they can be compared extremely fast using bit-wise or look-up table operators. Motivated by this view, the present work explores two main research directions: 1) finding mappings that better preserve the given notion of similarity while keeping the codes as compressed as possible, and 2) building efficient data structures that support non-exhaustive search among the compact codes. Our large-scale experimental results, reported on various benchmarks including datasets of up to one billion items, show a boost in retrieval performance in comparison to the state-of-the-art.
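    The claim that compact codes can be compared extremely fast using bit-wise or look-up table operators boils down to Hamming distance via XOR and popcount. The sketch below assumes 64-bit codes and an exhaustive scan purely for illustration; the thesis's actual similarity-preserving mapping and non-exhaustive index structures are not reproduced here.

```python
# Sketch: comparing 64-bit binary codes by Hamming distance using XOR + popcount.
# The code length, random database, and function names are assumptions for
# illustration; real systems prune candidates with index structures instead of
# scanning every code.
import random

def hamming_distance(code_a: int, code_b: int) -> int:
    """Number of differing bits between two compact codes."""
    return (code_a ^ code_b).bit_count()  # XOR then popcount (Python 3.10+)

def nearest_code(query: int, database: list[int]) -> int:
    """Exhaustive scan for the closest code by Hamming distance."""
    return min(database, key=lambda code: hamming_distance(query, code))

random.seed(0)
db = [random.getrandbits(64) for _ in range(10_000)]
q = random.getrandbits(64)
best = nearest_code(q, db)
print(f"nearest code at Hamming distance {hamming_distance(q, best)}")
```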

    16th SC@RUG 2019 proceedings 2018-2019