
    Geometric deep learning: going beyond Euclidean data

    Many scientific fields study data with an underlying structure that is a non-Euclidean space. Some examples include social networks in computational social sciences, sensor networks in communications, functional networks in brain imaging, regulatory networks in genetics, and meshed surfaces in computer graphics. In many applications, such geometric data are large and complex (in the case of social networks, on the scale of billions), and are natural targets for machine learning techniques. In particular, we would like to use deep neural networks, which have recently proven to be powerful tools for a broad range of problems in computer vision, natural language processing, and audio analysis. However, these tools have been most successful on data with an underlying Euclidean or grid-like structure, and in cases where the invariances of these structures are built into the networks used to model them. Geometric deep learning is an umbrella term for emerging techniques attempting to generalize (structured) deep neural models to non-Euclidean domains such as graphs and manifolds. The purpose of this paper is to overview different examples of geometric deep learning problems and present available solutions, key difficulties, applications, and future research directions in this nascent field.
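    One common way to generalize convolutions to graph domains, of the kind surveyed in this line of work, is to aggregate each node's neighbours through a degree-normalized adjacency matrix and then apply a shared weight matrix. The sketch below is illustrative only (the function name and shapes are assumptions, not taken from the paper):

    ```python
    import numpy as np

    def graph_conv(A, X, W):
        """One graph-convolution step: normalized neighbour aggregation
        followed by a shared linear map and a ReLU nonlinearity."""
        # Add self-loops so each node keeps its own features
        A_hat = A + np.eye(A.shape[0])
        # Symmetric degree normalization: D^{-1/2} (A + I) D^{-1/2}
        d = A_hat.sum(axis=1)
        D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
        A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt
        # Propagate features over edges, mix channels, apply ReLU
        return np.maximum(A_norm @ X @ W, 0.0)
    ```

    Because the same weight matrix is applied at every node, the layer is invariant to node relabelling, which is the graph analogue of the translation invariance built into grid CNNs.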

    Aspherical gravitational monopoles

    We show how to construct non-spherically-symmetric extended bodies of uniform density behaving exactly as pointlike masses. These ``gravitational monopoles'' have the following equivalent properties: (i) they generate, outside them, a spherically-symmetric gravitational potential $M/|x - x_O|$; (ii) their interaction energy with an external gravitational potential $U(x)$ is $-M U(x_O)$; and (iii) all their multipole moments (of order $l \geq 1$) with respect to their center of mass $O$ vanish identically. The method applies for any number of space dimensions. The free parameters entering the construction are: (1) an arbitrary surface $\Sigma$ bounding a connected open subset $\Omega$ of $R^3$; (2) the arbitrary choice of the center of mass $O$ within $\Omega$; and (3) the total volume of the body. An extension of the method allows one to construct homogeneous bodies which are gravitationally equivalent (in the sense of having exactly the same multipole moments) to any given body. Comment: 55 pages, LaTeX, submitted to Nucl. Phys.
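    The equivalence of properties (i) and (iii) follows from the standard multipole expansion of the exterior Newtonian potential (a textbook identity, not taken from the paper):

    ```latex
    U(\mathbf{x}) \;=\; \frac{M}{|\mathbf{x}-\mathbf{x}_O|}
      \;+\; \sum_{l \geq 1} \sum_{m=-l}^{l}
        \frac{Q_{lm}\, Y_{lm}(\theta,\varphi)}{|\mathbf{x}-\mathbf{x}_O|^{\,l+1}},
    ```

    so if every moment $Q_{lm}$ with $l \geq 1$ vanishes, only the monopole term $M/|\mathbf{x}-\mathbf{x}_O|$ survives outside the body.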

    Spectral Harmonics: Bridging Spectral Embedding and Matrix Completion in Self-Supervised Learning

    Self-supervised methods have received tremendous attention thanks to their seemingly heuristic approach to learning representations that respect the semantics of the data without any apparent supervision in the form of labels. A growing body of literature is already being published in an attempt to build a coherent and theoretically grounded understanding of the workings of a zoo of losses used in modern self-supervised representation learning methods. In this paper, we attempt to provide an understanding from the perspective of a Laplace operator and connect the inductive bias stemming from the augmentation process to a low-rank matrix completion problem. To this end, we leverage results from low-rank matrix completion to provide a theoretical analysis of the convergence of modern SSL methods and a key property that affects their downstream performance. Comment: 12 pages, 3 figures

    Diffusion Maps: Analysis and Applications

    Much of the data encountered in science and engineering is not as complicated as it seems. There is the possibility of finding low dimensional descriptions of this usually high dimensional data. One of the ways of achieving this is with the use of diffusion maps. Diffusion maps represent the dataset by a weighted graph in which points correspond to vertices and edges are weighted. The spectral properties of the graph Laplacian are then used to map the high dimensional data into a lower dimensional representation. The algorithm is introduced on simple test examples for which the low dimensional description is known. Justification of the algorithm is given by showing its equivalence to a suitable minimisation problem and to random walks on graphs. The description of random walks in terms of partial differential equations is discussed. The heat equation for a probability density function is derived and used to further analyse the algorithm. Applications of diffusion maps are presented at the end of this dissertation. The first application is clustering of data (i.e. partitioning of a data set into subsets so that the data points in each subset have similar characteristics). An approach based on diffusion maps (spectral clustering) is compared to the K-means clustering algorithm. We then discuss techniques for colour image quantization (reduction of distinct colours in an image). Finally, the diffusion maps are used to discover low dimensional descriptions of high dimensional sets of images.
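    The pipeline described above (weighted graph from pairwise affinities, random-walk normalization, spectral embedding) can be sketched in a few lines of numpy. This is a minimal illustration, not the dissertation's implementation; the function name and parameters are assumptions:

    ```python
    import numpy as np

    def diffusion_map(X, eps=1.0, n_components=2, t=1):
        """Embed rows of X using the leading non-trivial eigenvectors
        of a random walk on a Gaussian-weighted graph."""
        # Pairwise squared distances and Gaussian kernel weights (edge weights)
        sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        W = np.exp(-sq / eps)
        # Row-normalize: P is the transition matrix of a random walk on the graph
        P = W / W.sum(axis=1, keepdims=True)
        # P is conjugate to a symmetric matrix, so its eigenvalues are real
        vals, vecs = np.linalg.eig(P)
        order = np.argsort(-vals.real)
        vals, vecs = vals.real[order], vecs.real[:, order]
        # Drop the trivial eigenpair (eigenvalue 1, constant eigenvector) and
        # scale coordinates by eigenvalues raised to the diffusion time t
        return (vals[1:n_components + 1] ** t) * vecs[:, 1:n_components + 1]
    ```

    Increasing the diffusion time `t` damps the contribution of smaller eigenvalues, so the embedding increasingly emphasizes the slowest, large-scale modes of the random walk.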