
    A Convolutional Neural Network into graph space

    Convolutional neural networks (CNNs) have, within a few decades, outperformed previous state-of-the-art methods in classification tasks. However, as originally formalised, CNNs are bound to operate on Euclidean spaces: convolution is a signal-processing operation defined on Euclidean domains. This has restricted the main uses of deep learning to Euclidean-structured data such as sound or images. Yet many application fields (among them network analysis, computational social science, chemo-informatics, and computer graphics) produce non-Euclidean data such as graphs, networks, or manifolds. In this paper we propose a new convolutional neural network architecture defined directly in graph space: both convolution and pooling operators are defined in the graph domain, and we show that the model can be trained by back-propagation. Experimental results show that its performance reaches the state-of-the-art level on simple tasks, that it is robust to changes of graph domain, and that it improves on other Euclidean and non-Euclidean convolutional architectures.
    Comment: arXiv admin note: text overlap with arXiv:1611.08402 by other authors
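    The abstract describes convolution and pooling operators defined directly on graphs. As a rough, generic illustration of a graph-domain convolution (a plain neighbourhood-averaging layer in NumPy, not the authors' exact operator), one layer might look like this:

```python
import numpy as np

def graph_conv(A, X, W):
    """One generic graph-convolution layer: average each node's
    neighbourhood (including itself), then apply a shared linear
    filter followed by ReLU.

    A : (n, n) adjacency matrix
    X : (n, d_in) node features
    W : (d_in, d_out) filter weights
    """
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)    # neighbourhood sizes
    H = (A_hat @ X) / deg                     # mean over neighbourhood
    return np.maximum(H @ W, 0.0)             # shared filter + ReLU

# toy 4-node path graph
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.randn(4, 3)
W = np.random.randn(3, 2)
print(graph_conv(A, X, W).shape)  # (4, 2)
```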

    Non-Euclidean Monotone Operator Theory with Applications to Recurrent Neural Networks

    We provide a novel transcription of monotone operator theory to the non-Euclidean finite-dimensional spaces $\ell_1$ and $\ell_\infty$. We first establish properties of mappings which are monotone with respect to the non-Euclidean norms $\ell_1$ or $\ell_\infty$. In analogy with their Euclidean counterparts, mappings which are monotone with respect to a non-Euclidean norm are amenable to numerous algorithms for computing their zeros. We demonstrate that several classic iterative methods for computing zeros of monotone operators are directly applicable in the non-Euclidean framework. We present a case study in the equilibrium computation of recurrent neural networks and demonstrate that casting the computation as a suitable operator-splitting problem improves convergence rates.
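    As a hedged sketch of the setting the abstract refers to: if the RNN update map is contractive in the $\ell_\infty$ norm (for example, a 1-Lipschitz activation with $\|W\|_\infty < 1$), even plain Picard iteration converges to the equilibrium. The operator splitting the paper proposes is more sophisticated; this baseline only illustrates why non-Euclidean contractivity matters.

```python
import numpy as np

def rnn_equilibrium(W, U, b, x, tol=1e-8, max_iter=10_000):
    """Compute z* = tanh(W z* + U x + b) by Picard iteration.

    With tanh (1-Lipschitz) the update map is a contraction in the
    l_inf norm whenever the induced inf-norm of W is below 1, so the
    iteration converges to the unique equilibrium.
    """
    assert np.abs(W).sum(axis=1).max() < 1.0, "need ||W||_inf < 1"
    z = np.zeros(W.shape[0])
    for _ in range(max_iter):
        z_next = np.tanh(W @ z + U @ x + b)
        if np.abs(z_next - z).max() < tol:    # l_inf stopping rule
            return z_next
        z = z_next
    return z

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 8))
W /= np.abs(W).sum(axis=1, keepdims=True) * 1.1  # force ||W||_inf < 1
U = rng.standard_normal((8, 4))
b, x = rng.standard_normal(8), rng.standard_normal(4)
print(rnn_equilibrium(W, U, b, x))
```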

    A Transfer Principle: Universal Approximators Between Metric Spaces From Euclidean Universal Approximators

    We build universal approximators of continuous maps between arbitrary Polish metric spaces $\mathcal{X}$ and $\mathcal{Y}$ using universal approximators between Euclidean spaces as building blocks. Earlier results assume that the output space $\mathcal{Y}$ is a topological vector space. We overcome this limitation by "randomization": our approximators output discrete probability measures over $\mathcal{Y}$. When $\mathcal{X}$ and $\mathcal{Y}$ are Polish without additional structure, we prove very general qualitative guarantees; when they have suitable combinatorial structure, we prove quantitative guarantees for Hölder-like maps, including maps between finite graphs, solution operators to rough differential equations between certain Carnot groups, and continuous non-linear operators between Banach spaces arising in inverse problems. In particular, we show that the required number of Dirac measures is determined by the combinatorial structure of $\mathcal{X}$ and $\mathcal{Y}$. For barycentric $\mathcal{Y}$, including Banach spaces, $\mathbb{R}$-trees, Hadamard manifolds, and Wasserstein spaces over Polish metric spaces, our approximators reduce to $\mathcal{Y}$-valued functions. When the Euclidean approximators are neural networks, our constructions generalize transformer networks, providing a new probabilistic viewpoint of geometric deep learning.
    Comment: 14 figures, 3 tables, 78 pages (main 40, proofs 26, acknowledgments and references 12)
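    A toy sketch of the "randomization" idea, under purely illustrative assumptions (a fixed set of candidate atoms in $\mathcal{Y}$ and a random linear feature map standing in for the Euclidean approximator; neither is the paper's construction):

```python
import numpy as np

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

class ProbabilisticApproximator:
    """Maps x to a discrete measure sum_i p_i * delta_{y_i} over fixed
    atoms y_i in Y. The atoms and the linear feature map are
    illustrative placeholders for the paper's building blocks."""

    def __init__(self, dim_in, atoms, seed=0):
        rng = np.random.default_rng(seed)
        self.atoms = atoms                    # candidate points in Y
        self.W = rng.standard_normal((len(atoms), dim_in))

    def __call__(self, x):
        logits = self.W @ x                   # Euclidean building block
        p = softmax(logits)                   # weights of the Diracs
        return p, self.atoms

    def barycenter(self, x):
        # For barycentric Y (here a vector space) the output measure
        # collapses to a single Y-valued point, sum_i p_i * y_i.
        p, ys = self(x)
        return p @ ys

model = ProbabilisticApproximator(dim_in=3,
                                  atoms=np.linspace(-1, 1, 5)[:, None])
x = np.array([0.2, -0.5, 1.0])
p, ys = model(x)
print(p.sum(), model.barycenter(x))  # 1.0, a single point in Y
```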

    Curve Your Attention: Mixed-Curvature Transformers for Graph Representation Learning

    Real-world graphs naturally exhibit hierarchical or cyclical structures that are ill-suited to the typical Euclidean space. While graph neural networks exist that leverage hyperbolic or spherical spaces to embed such structures more accurately, these methods are confined to the message-passing paradigm, leaving the models vulnerable to side effects such as oversmoothing and oversquashing. More recent work has proposed global attention-based graph Transformers that can easily model long-range interactions, but their extension to non-Euclidean geometry remains unexplored. To bridge this gap, we propose the Fully Product-Stereographic Transformer, a generalization of Transformers that operates entirely on products of constant-curvature spaces. When combined with tokenized graph Transformers, our model can learn the curvature appropriate for the input graph end-to-end, without additional tuning over different curvature initializations. We also provide a kernelized approach to non-Euclidean attention, which lets our model run with time and memory cost linear in the number of nodes and edges while respecting the underlying geometry. Experiments on graph reconstruction and node classification demonstrate the benefits of generalizing Transformers to the non-Euclidean domain.
    Comment: 19 pages, 7 figures
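    Product-stereographic constructions build on standard $\kappa$-stereographic operations. Below is a minimal NumPy sketch of Möbius addition and the curvature-parametrised geodesic distance, following the usual $\kappa$-stereographic formulas (not necessarily the paper's implementation); a single sign of $\kappa$ switches between hyperbolic, flat, and spherical geometry.

```python
import numpy as np

def mobius_add(x, y, k):
    """Mobius addition in the kappa-stereographic model (curvature k)."""
    xy = np.dot(x, y)
    x2, y2 = np.dot(x, x), np.dot(y, y)
    num = (1 - 2 * k * xy - k * y2) * x + (1 + k * x2) * y
    den = 1 - 2 * k * xy + k**2 * x2 * y2
    return num / den

def tan_k_inv(u, k):
    """Curvature-dependent inverse tangent; recovers u as k -> 0."""
    if k > 0:
        return np.arctan(np.sqrt(k) * u) / np.sqrt(k)
    if k < 0:
        return np.arctanh(np.sqrt(-k) * u) / np.sqrt(-k)
    return u

def dist_k(x, y, k):
    """Geodesic distance d_k(x, y) = 2 * tan_k^{-1}(||(-x) +_k y||).
    The k -> 0 limit is 2 * ||x - y||, Euclidean up to scale."""
    return 2 * tan_k_inv(np.linalg.norm(mobius_add(-x, y, k)), k)

x = np.array([0.1, 0.2])
y = np.array([-0.3, 0.05])
for k in (-1.0, 0.0, 1.0):   # hyperbolic, Euclidean, spherical
    print(k, dist_k(x, y, k))
```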

    Change Detection in Graph Streams by Learning Graph Embeddings on Constant-Curvature Manifolds

    The space of graphs is often characterised by a non-trivial geometry, which complicates learning and inference in practical applications. A common approach is to use embedding techniques to represent graphs as points in a conventional Euclidean space, but non-Euclidean spaces have often been shown to be better suited for embedding graphs. Among these, constant-curvature Riemannian manifolds (CCMs) offer embedding spaces suitable for studying the statistical properties of a graph distribution, as they provide ways to easily compute metric geodesic distances. In this paper, we focus on the problem of detecting changes in stationarity in a stream of attributed graphs. To this end, we introduce a novel change detection framework based on neural networks and CCMs that takes into account the non-Euclidean nature of graphs. Our contribution is twofold. First, via a novel approach based on adversarial learning, we compute graph embeddings by training an autoencoder to represent graphs on CCMs. Second, we introduce two novel change detection tests operating on CCMs. We perform experiments on synthetic data, as well as on two real-world application scenarios: the detection of epileptic seizures using functional connectivity brain networks, and the detection of hostility between two subjects using human skeletal graphs. Results show that the proposed methods can detect even small changes in a graph-generating process, consistently outperforming approaches based on Euclidean embeddings.
    Comment: 14 pages, 8 figures
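    Once graphs are embedded on a CCM, change detection can reduce to monitoring a scalar statistic such as the geodesic distance of each incoming embedding to a nominal-regime reference. The sketch below uses a generic one-sided CUSUM over such a statistic stream (a standard detector, not the paper's two tests) to illustrate that last step:

```python
import numpy as np

def cusum(stream_stats, drift=0.0, threshold=5.0):
    """One-sided CUSUM over a stream of scalar statistics, e.g.
    geodesic distances of incoming graph embeddings to a reference
    Frechet mean; sustained growth signals a change in the process."""
    mu = np.mean(stream_stats[:20])           # calibrate on early window
    g = 0.0
    for t, s in enumerate(stream_stats):
        g = max(0.0, g + (s - mu - drift))    # accumulate positive drift
        if g > threshold:
            return t                          # change detected at time t
    return None

rng = np.random.default_rng(1)
# nominal regime for 100 steps, then a shifted regime
stats = np.concatenate([rng.normal(1.0, 0.2, 100),
                        rng.normal(1.6, 0.2, 50)])
print(cusum(stats))   # detection time shortly after t = 100
```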