    Information-geometric Markov Chain Monte Carlo methods using Diffusions

    Recent work incorporating geometric ideas in Markov chain Monte Carlo is reviewed in order to highlight these advances and their possible application in a range of domains beyond statistics. A full exposition of Markov chains and their use in Monte Carlo simulation for statistical inference and molecular dynamics is provided, with particular emphasis on methods based on Langevin diffusions. After this, geometric concepts in Markov chain Monte Carlo are introduced. A full derivation of the Langevin diffusion on a Riemannian manifold is given, together with a discussion of appropriate Riemannian metric choice for different problems. A survey of applications is provided, and some open questions are discussed. Comment: 22 pages, 2 figures
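
    The Langevin-diffusion samplers the abstract refers to can be illustrated with a minimal sketch of the Metropolis-adjusted Langevin algorithm (MALA). This is a generic textbook version, not code from the paper; the function name `mala_sample` and all parameter choices are illustrative.

```python
import numpy as np

def mala_sample(log_prob, grad_log_prob, x0, step=0.5, n_steps=5000, seed=0):
    """Metropolis-adjusted Langevin algorithm (MALA).

    Proposals follow a discretised Langevin diffusion,
        x' = x + (step**2 / 2) * grad_log_prob(x) + step * noise,
    and a Metropolis-Hastings accept/reject step corrects the
    discretisation error so the chain targets exp(log_prob) exactly.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_steps, x.size))

    def log_q(dst, src):
        # Log density (up to a constant) of the Gaussian proposal dst | src.
        mean = src + 0.5 * step**2 * grad_log_prob(src)
        return -np.sum((dst - mean) ** 2) / (2.0 * step**2)

    for i in range(n_steps):
        prop = x + 0.5 * step**2 * grad_log_prob(x) \
                 + step * rng.standard_normal(x.size)
        log_alpha = (log_prob(prop) + log_q(x, prop)) \
                  - (log_prob(x) + log_q(prop, x))
        if np.log(rng.uniform()) < log_alpha:
            x = prop
        samples[i] = x
    return samples

# Example: sample a standard 1-D Gaussian, log pi(x) = -x**2 / 2.
samples = mala_sample(lambda x: -0.5 * np.sum(x**2),
                      lambda x: -x, x0=[3.0])
```

    The Riemannian-manifold variants surveyed in the paper replace the identity preconditioner implicit above with a position-dependent metric tensor in both the drift and the noise.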

    Geometric deep learning: going beyond Euclidean data

    Many scientific fields study data with an underlying structure that is a non-Euclidean space. Some examples include social networks in computational social sciences, sensor networks in communications, functional networks in brain imaging, regulatory networks in genetics, and meshed surfaces in computer graphics. In many applications, such geometric data are large and complex (in the case of social networks, on the scale of billions), and are natural targets for machine learning techniques. In particular, we would like to use deep neural networks, which have recently proven to be powerful tools for a broad range of problems in computer vision, natural language processing, and audio analysis. However, these tools have been most successful on data with an underlying Euclidean or grid-like structure, and in cases where the invariances of these structures are built into the networks used to model them. Geometric deep learning is an umbrella term for emerging techniques attempting to generalize (structured) deep neural models to non-Euclidean domains such as graphs and manifolds. The purpose of this paper is to overview different examples of geometric deep learning problems and present available solutions, key difficulties, applications, and future research directions in this nascent field.
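
    A common building block in the graph branch of geometric deep learning is a graph-convolution layer that aggregates neighbour features through a normalised adjacency matrix. The sketch below is a generic NumPy illustration of that idea (in the spirit of spectral GCNs), not an implementation from the survey; `gcn_layer` and the toy graph are made up for the example.

```python
import numpy as np

def gcn_layer(adj, features, weight):
    """One graph-convolution layer: add self-loops, symmetrically
    normalise the adjacency, aggregate neighbour features, then apply
    a linear map followed by a ReLU nonlinearity."""
    a_hat = adj + np.eye(adj.shape[0])            # adjacency with self-loops
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))  # D^{-1/2}
    norm = a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(norm @ features @ weight, 0.0)  # ReLU

# Tiny 3-node path graph with 2 features per node; identity weights
# so the output is just the normalised neighbourhood average.
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
feats = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [1.0, 1.0]])
out = gcn_layer(adj, feats, np.eye(2))
```

    The same aggregate-then-transform pattern, with learned `weight` matrices stacked over several layers, underlies most of the graph architectures the survey covers.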

    From the Jordan product to Riemannian geometries on classical and quantum states

    The Jordan product on the self-adjoint part of a finite-dimensional $C^{*}$-algebra $\mathscr{A}$ is shown to give rise to Riemannian metric tensors on suitable manifolds of states on $\mathscr{A}$, and the covariant derivative, the geodesics, the Riemann tensor, and the sectional curvature of all these metric tensors are explicitly computed. In particular, it is proved that the Fisher--Rao metric tensor is recovered in the Abelian case, that the Fubini--Study metric tensor is recovered when we consider pure states on the algebra $\mathcal{B}(\mathcal{H})$ of linear operators on a finite-dimensional Hilbert space $\mathcal{H}$, and that the Bures--Helstrom metric tensor is recovered when we consider faithful states on $\mathcal{B}(\mathcal{H})$. Moreover, an alternative derivation of these Riemannian metric tensors in terms of the GNS construction associated to a state is presented. In the case of pure and faithful states on $\mathcal{B}(\mathcal{H})$, this alternative geometrical description clarifies the analogy between the Fubini--Study and the Bures--Helstrom metric tensors. Comment: 32 pages. Minor improvements. References added. Comments are welcome
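
    For orientation, the two limiting cases named in the abstract have standard coordinate expressions (these are the textbook definitions, not formulas taken from the paper):

```latex
% Fisher--Rao metric on a parametric family p(x;\theta) (Abelian case):
g_{jk}(\theta) \;=\; \sum_{x} p(x;\theta)\,
  \frac{\partial \log p(x;\theta)}{\partial \theta^{j}}\,
  \frac{\partial \log p(x;\theta)}{\partial \theta^{k}} \, ,
%
% Fubini--Study line element on normalised pure states |\psi\rangle:
ds^{2}_{\mathrm{FS}} \;=\;
  \langle d\psi \,|\, d\psi \rangle
  - \langle d\psi \,|\, \psi \rangle \langle \psi \,|\, d\psi \rangle \, .
```

    The paper's contribution is to exhibit both, together with the Bures--Helstrom tensor on faithful states, as special cases of one Jordan-product construction.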

    Recent advances in directional statistics

    Mainstream statistical methodology is generally applicable to data observed in Euclidean space. There are, however, numerous contexts of considerable scientific interest in which the natural supports for the data under consideration are Riemannian manifolds like the unit circle, torus, sphere and their extensions. Typically, such data can be represented using one or more directions, and directional statistics is the branch of statistics that deals with their analysis. In this paper we provide a review of the many recent developments in the field since the publication of Mardia and Jupp (1999), still the most comprehensive text on directional statistics. Many of those developments have been stimulated by interesting applications in fields as diverse as astronomy, medicine, genetics, neurology, aeronautics, acoustics, image analysis, text mining, environmetrics, and machine learning. We begin by considering developments for the exploratory analysis of directional data before progressing to distributional models, general approaches to inference, hypothesis testing, regression, nonparametric curve estimation, methods for dimension reduction, classification and clustering, and the modelling of time series, spatial and spatio-temporal data. An overview of currently available software for analysing directional data is also provided, and potential future developments discussed. Comment: 61 pages