
    A Magnetic Framelet-Based Convolutional Neural Network for Directed Graphs

    Recent years have witnessed the surging popularity of studies on directed graphs (digraphs) and digraph neural networks. With the unique capability of encoding directional relationships, digraphs have shown their superiority in modelling many real-life applications, such as citation analysis and website hyperlinks. Spectral Graph Convolutional Neural Networks (spectral GCNNs), a powerful tool for processing and analyzing undirected graph data, have recently been introduced to digraphs. Although spectral GCNNs typically apply frequency filtering via the Fourier transform to obtain representations with selective information, research shows that model performance can be enhanced by framelet-transform-based filtering. However, the vast majority of such research considers spectral GCNNs only for undirected graphs. In this thesis, we introduce Framelet-MagNet, a magnetic framelet-based spectral GCNN for digraphs. The model adopts a magnetic framelet transform that decomposes the input digraph data into low-pass and high-pass frequency components in the spectral domain, forming a more sophisticated digraph representation for filtering. Digraph framelets are constructed with the complex-valued magnetic Laplacian, leading to signal processing in both the real and complex domains simultaneously. To the best of our knowledge, this is the first attempt to conduct framelet-based convolution on digraph data in both the real and complex domains. We empirically validate the predictive power of Framelet-MagNet on various tasks, including node classification, link prediction, and denoising. Moreover, experimental results show that Framelet-MagNet outperforms state-of-the-art approaches across several benchmark datasets.
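The construction above rests on the magnetic Laplacian, which encodes edge direction as a complex phase while remaining Hermitian. As a rough illustration (the function name and the choice of charge parameter q are ours, not the thesis's implementation), a minimal NumPy sketch:

```python
import numpy as np

def magnetic_laplacian(A, q=0.25):
    """Normalized magnetic Laplacian of a digraph adjacency matrix A.

    q is the 'charge' parameter; q = 0 recovers the undirected case.
    """
    A_s = 0.5 * (A + A.T)                    # symmetrized adjacency
    theta = 2.0 * np.pi * q * (A - A.T)      # phase matrix encoding direction
    H = A_s * np.exp(1j * theta)             # Hermitian magnetic adjacency
    d = A_s.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    # L = I - D^{-1/2} H D^{-1/2} is Hermitian, so its eigenvalues are real
    return np.eye(len(A)) - D_inv_sqrt @ H @ D_inv_sqrt

# directed 3-cycle: 0 -> 1 -> 2 -> 0
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]], dtype=float)
L = magnetic_laplacian(A, q=0.25)
```

Because L is Hermitian and positive semi-definite, its eigenvalues are real and non-negative, so spectral filters such as framelets can be defined on it just as in the undirected case.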

    Bregman Graph Neural Network

    Much recent research on graph neural networks (GNNs) has focused on formulating GNN architectures as an optimization problem under a smoothness assumption. However, in node classification tasks, the smoothing effect induced by GNNs tends to assimilate representations and over-homogenize the labels of connected nodes, leading to adverse effects such as over-smoothing and misclassification. In this paper, we propose a novel bilevel optimization framework for GNNs inspired by the notion of Bregman distance. We demonstrate that the GNN layer derived from this framework can effectively mitigate the over-smoothing issue by introducing a mechanism reminiscent of the "skip connection". We validate our theoretical results through comprehensive empirical studies in which Bregman-enhanced GNNs outperform their original counterparts on both homophilic and heterophilic graphs. Furthermore, our experiments show that Bregman GNNs maintain robust accuracy even when the number of layers is high, suggesting the effectiveness of the proposed method in alleviating the over-smoothing issue.
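For readers unfamiliar with the Bregman distance that motivates the framework, a minimal sketch follows (function names are illustrative; the paper's bilevel formulation is not reproduced here):

```python
import numpy as np

def bregman_divergence(phi, grad_phi, x, y):
    """Bregman distance D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>.

    Non-negative whenever phi is convex; not symmetric in general.
    """
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

# choosing phi(x) = ||x||^2 recovers the squared Euclidean distance
phi = lambda v: float(np.dot(v, v))
grad_phi = lambda v: 2.0 * v
x, y = np.array([1.0, 2.0]), np.array([0.0, 1.0])
d = bregman_divergence(phi, grad_phi, x, y)   # equals ||x - y||^2 = 2 here
```

Different choices of the convex generator phi (e.g. negative entropy) yield different divergences, which is what gives the framework its flexibility.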

    From Continuous Dynamics to Graph Neural Networks: Neural Diffusion and Beyond

    Graph neural networks (GNNs) have demonstrated significant promise in modelling relational data and have been widely applied in various fields of interest. The key mechanism behind GNNs is so-called message passing, where information is iteratively aggregated to central nodes from their neighbourhoods. Such a scheme has been found to be intrinsically linked to a physical process known as heat diffusion, where the propagation of GNNs naturally corresponds to the evolution of heat density. Analogizing message passing to heat dynamics allows us to fundamentally understand the power and pitfalls of GNNs and consequently informs better model design. Recently, a plethora of works has emerged proposing GNNs inspired by the continuous-dynamics formulation, in an attempt to mitigate the known limitations of GNNs, such as oversmoothing and oversquashing. In this survey, we provide the first systematic and comprehensive review of studies that leverage the continuous perspective of GNNs. To this end, we introduce foundational ingredients for adapting continuous dynamics to GNNs, along with a general framework for the design of graph neural dynamics. We then review and categorize existing works based on their driving mechanisms and underlying dynamics. We also summarize how the limitations of classic GNNs can be addressed under the continuous framework. We conclude by identifying multiple open research directions.
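The correspondence between message passing and heat diffusion can be made concrete with an explicit Euler discretization of the graph heat equation dX/dt = -LX. The following toy sketch (our illustration, not taken from the survey) also exhibits the over-smoothing behaviour the survey discusses:

```python
import numpy as np

def heat_diffusion_step(X, A, alpha=0.1):
    """One explicit Euler step of graph heat diffusion dX/dt = -L X.

    X: (n, d) node features; A: symmetric adjacency; alpha: step size.
    A single GCN-style propagation corresponds to one such step.
    """
    L = np.diag(A.sum(axis=1)) - A    # combinatorial graph Laplacian
    return X - alpha * (L @ X)

# path graph 0 - 1 - 2: repeated diffusion drives features to their mean
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
X = np.array([[1.0], [0.0], [-1.0]])   # initial mean is 0
for _ in range(200):
    X = heat_diffusion_step(X, A)
# X is now nearly constant across nodes: the over-smoothing phenomenon
```

Stacking many such propagation steps collapses node features toward a constant, which is precisely why continuous-dynamics reformulations seek alternative evolutions.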

    Exposition on over-squashing problem on GNNs: Current Methods, Benchmarks and Challenges

    Graph-based message-passing neural networks (MPNNs) have achieved remarkable success in both node- and graph-level learning tasks. However, several identified problems, including over-smoothing (OSM), limited expressive power, and over-squashing (OSQ), still limit the performance of MPNNs. In particular, OSQ is the most recently identified of these problems: MPNNs gradually lose learning accuracy when long-range dependencies between graph nodes are required. In this work, we provide an exposition of the OSQ problem by summarizing the different formulations of OSQ in the current literature, as well as the three categories of approaches for addressing it. In addition, we discuss the alignment between OSQ and expressive power and the trade-off between OSQ and OSM. Furthermore, we summarize the empirical methods used in existing works to verify the effectiveness of OSQ mitigation approaches, with illustrations of their computational complexities. Lastly, we list some open questions that, to the best of our knowledge, are of interest for further exploration of the OSQ problem, along with potential directions.
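One common formulation of OSQ in the literature bounds the Jacobian sensitivity of a node's representation to a distant input, roughly |∂h_v^(r)/∂x_u| ≤ c^r (Â^r)_{vu}, where Â is the normalized adjacency. The following toy sketch (our illustration, not from this work) shows how this bound decays with distance on a path graph:

```python
import numpy as np

def sensitivity_bound(A, r):
    """Proxy for the Jacobian bound |d h_v^(r) / d x_u| <= c^r (A_hat^r)_{vu}.

    A_hat is the GCN-style normalized adjacency with self-loops; tiny
    entries for distant (u, v) pairs indicate over-squashing.
    """
    n = len(A)
    A_tilde = A + np.eye(n)
    deg = A_tilde.sum(axis=1)
    A_hat = A_tilde / np.sqrt(np.outer(deg, deg))
    return np.linalg.matrix_power(A_hat, r)

# path graph on 8 nodes; influence of node 0 on node 7 after 7 layers
n = 8
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
S = sensitivity_bound(A, r=n - 1)
# S[7, 0] is orders of magnitude smaller than S[7, 7]
```

The exponentially small entry for the distant pair is the quantitative signature of OSQ that mitigation methods (e.g. rewiring) aim to enlarge.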

    The art of face-saving and culture-changing: sculpting Chinese football’s past, present and future

    In this paper, we consider the football statues of China, whose football team has dramatically underperformed relative to its population size and economic power. Although China lacks a participative grassroots football culture and has struggled to establish a credible domestic league, recent government intervention and investment have seen football’s profile rise dramatically. China’s many football statues are largely atypical in comparison to the rest of the world, including their depiction of anonymous figures rather than national or local heroes, the incorporation of tackling scenes in their designs, and their location at training camps. Through four specific examples and reference to a global database, we illustrate how these statues reflect the tensions and difficulties inherent in China’s desire to integrate itself into global football and achieve its stated goal of hosting and winning the FIFA World Cup, whilst simultaneously upholding national, cultural and political values such as the primacy of hard work and learning, and saving face in defeat.

    A Simple Yet Effective SVD-GCN for Directed Graphs

    In this paper, we propose a simple yet effective graph neural network for directed graphs (digraphs) based on the classic Singular Value Decomposition (SVD), named SVD-GCN. The new graph neural network is built upon the graph SVD-framelet to better decompose graph signals on the SVD "frequency" bands. Furthermore, the new framelet SVD-GCN is scaled up to larger graphs using Chebyshev polynomial approximation. Through empirical experiments conducted on several node classification datasets, we find that SVD-GCN achieves remarkable improvements in a variety of graph node learning tasks, outperforming GCN and many other state-of-the-art graph neural networks for digraphs. Moreover, we empirically demonstrate that SVD-GCN has great denoising capability and robustness to high-level graph data attacks. The theoretical and experimental results show that SVD-GCN is effective on a variety of graph datasets, while maintaining stable and even better performance than the state of the art.
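To illustrate the general idea of SVD-based spectral filtering on a digraph (a simplified sketch of the core ingredient, not the paper's SVD-framelet construction), consider a truncated-SVD low-pass filter:

```python
import numpy as np

def svd_lowpass_filter(A, X, k):
    """Low-pass filtering of node signals X via truncated SVD of A.

    The SVD A = U S V^T exists for any (asymmetric) digraph adjacency,
    unlike an orthogonal eigendecomposition. Keeping the top-k singular
    triplets retains only the dominant SVD 'frequency' bands.
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    A_k = (U[:, :k] * s[:k]) @ Vt[:k, :]   # best rank-k approximation of A
    return A_k @ X

# small digraph: 0 -> 1, 1 -> 2, 2 -> 0, 0 -> 2
A = np.array([[0, 1, 1],
              [0, 0, 1],
              [1, 0, 0]], dtype=float)
X = np.arange(12, dtype=float).reshape(3, 4)   # toy node features
Y = svd_lowpass_filter(A, X, k=2)
# with k = n = 3 the filter reduces to plain propagation A @ X
```

The appeal of the SVD here is that it sidesteps the non-symmetry of digraph adjacency matrices, for which a real orthogonal eigendecomposition need not exist.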