
    On the geometry of mixed states and the Fisher information tensor

    In this paper, we review the co-adjoint orbit formulation of finite-dimensional quantum mechanics and, in this framework, interpret the notion of the quantum Fisher information index (and metric). Following previous work by some of the authors, who introduced the definition of the Fisher information tensor, we show how its antisymmetric part is the pullback of the natural Kostant-Kirillov-Souriau symplectic form along a natural diffeomorphism. In order to do this, we need to understand the symmetric logarithmic derivative as a proper 1-form, settling the issues surrounding its very definition and explicit computation. Moreover, the fibration of co-adjoint orbits, seen as spaces of mixed states, is also discussed. Comment: 27 pages; Accepted Manuscript
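
    As a concrete illustration of the objects involved (not taken from the paper; all names are illustrative), the sketch below computes the symmetric logarithmic derivative L, defined by d(rho) = (L rho + rho L)/2, and the quantum Fisher information Tr(rho L^2), working in the eigenbasis of the density matrix.

        import numpy as np

        def sld_and_qfi(rho, drho, tol=1e-12):
            """Symmetric logarithmic derivative L and quantum Fisher information,
            solving d(rho) = (L rho + rho L) / 2 in the eigenbasis of rho."""
            p, U = np.linalg.eigh(rho)            # spectral decomposition of rho
            d = U.conj().T @ drho @ U             # parameter derivative in that basis
            L = np.zeros_like(d)
            for i in range(len(p)):
                for j in range(len(p)):
                    s = p[i] + p[j]
                    if s > tol:                   # L is only determined on the support of rho
                        L[i, j] = 2.0 * d[i, j] / s
            L = U @ L @ U.conj().T                # back to the original basis
            return L, np.real(np.trace(rho @ L @ L))

        # Mixed qubit family rho(theta) = (I + theta * sigma_z) / 2:
        sz = np.diag([1.0, -1.0])
        theta = 0.3
        L, qfi = sld_and_qfi(0.5 * (np.eye(2) + theta * sz), 0.5 * sz)
        # qfi agrees with the analytic value 1 / (1 - theta**2)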

    Statistical Geometry in Quantum Mechanics

    A statistical model M is a family of probability distributions, characterised by a set of continuous parameters that constitute the parameter space. This possesses natural geometrical properties induced by the embedding of the family of probability distributions into the Hilbert space H. By considering the square-root density function we can regard M as a submanifold of the unit sphere in H. Therefore, H embodies the 'state space' of the probability distributions, and the geometry of M can be described in terms of its embedding in H. The geometry in question is characterised by a natural Riemannian metric (the Fisher-Rao metric), thus allowing us to formulate the principles of classical statistical inference in a natural geometric setting. In particular, we focus attention on the variance lower bounds for statistical estimation, and establish generalisations of the classical Cramér-Rao and Bhattacharyya inequalities. The statistical model M is then specialised to the case of a submanifold of the state space of a quantum mechanical system. This is pursued by introducing a compatible complex structure on the underlying real Hilbert space, which allows the operations of ordinary quantum mechanics to be reinterpreted in the language of real Hilbert space geometry. The application of generalised variance bounds in the case of quantum statistical estimation leads to a set of higher-order corrections to the Heisenberg uncertainty relations for canonically conjugate observables. Comment: 32 pages, LaTeX file; extended version to include quantum measurement theory
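
    A minimal numerical sketch of the square-root embedding described above (illustrative only, not the paper's formalism): for a one-parameter family, the Fisher-Rao line element g(theta) = 4 * integral (d sqrt(p)/d theta)^2 dx reproduces the classical Fisher information that enters the Cramér-Rao bound.

        import numpy as np

        def fisher_rao_metric(density, theta, x, h=1e-4):
            """Estimate g(theta) = 4 * int (d sqrt(p)/d theta)^2 dx on a grid x."""
            dpsi = (np.sqrt(density(x, theta + h)) - np.sqrt(density(x, theta))) / h
            return 4.0 * np.sum(dpsi**2) * (x[1] - x[0])

        # Gaussian location family p(x; mu) = N(mu, 1): the metric is constant, g = 1,
        # so the Cramer-Rao bound for n samples reads var(mu_hat) >= 1 / n.
        gaussian = lambda x, mu: np.exp(-(x - mu)**2 / 2) / np.sqrt(2 * np.pi)
        x = np.linspace(-10.0, 10.0, 20001)
        g = fisher_rao_metric(gaussian, 0.0, x)   # approximately 1.0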

    Interest Rates and Information Geometry

    The space of probability distributions on a given sample space possesses natural geometric properties. For example, in the case of a smooth parametric family of probability distributions on the real line, the parameter space has a Riemannian structure induced by the embedding of the family into the Hilbert space of square-integrable functions, and is characterised by the Fisher-Rao metric. In the nonparametric case the relevant geometry is determined by the spherical distance function of Bhattacharyya. In the context of term structure modelling, we show that minus the derivative of the discount function with respect to the maturity date gives rise to a probability density. This follows as a consequence of the positivity of interest rates. Therefore, by mapping the density functions associated with a given family of term structures to Hilbert space, the resulting metrical geometry can be used to analyse the relationship of yield curves to one another. We show that the general arbitrage-free yield curve dynamics can be represented as a process taking values in the convex space of smooth density functions on the positive real line. It follows that the theory of interest rate dynamics can be represented by a class of processes in Hilbert space. We also derive the dynamics for the central moments associated with the distribution determined by the yield curve. Comment: 20 pages, 3 figures
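
    A minimal sketch of the density construction mentioned above (illustrative assumptions: a flat instantaneous forward curve and a simple grid quadrature): with positive rates the discount function decreases from 1 towards 0, so minus its maturity derivative is non-negative and integrates to one.

        import numpy as np

        def density_from_forward(forward, x):
            """rho(x) = -dP/dx, where P(x) = exp(-int_0^x f(s) ds) is the discount function."""
            dx = x[1] - x[0]
            P = np.exp(-np.cumsum(forward(x)) * dx)   # crude quadrature of the forward curve
            rho = -np.gradient(P, x)                  # equals f(x) * P(x) >= 0 for positive rates
            return rho / (rho.sum() * dx)             # renormalise away the discretisation error

        x = np.linspace(0.0, 300.0, 30001)            # maturities in years (long enough that the tail is negligible)
        flat = lambda t: np.full_like(t, 0.03)        # flat 3% forward curve (an assumption)
        rho = density_from_forward(flat, x)           # approximately an Exp(0.03) density
        mean_maturity = (x * rho).sum() * (x[1] - x[0])  # first moment, about 1 / 0.03 years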

    The Bregman chord divergence

    Distances are fundamental primitives whose choice significantly impacts the performance of algorithms in machine learning and signal processing. However, selecting the most appropriate distance for a given task is a challenging endeavor. Instead of testing, one by one, the entries of an ever-expanding dictionary of ad hoc distances, one rather prefers to consider parametric classes of distances that are exhaustively characterized by axioms derived from first principles. Bregman divergences are such a class. However, fine-tuning a Bregman divergence is delicate since it requires smoothly adjusting a functional generator. In this work, we propose an extension of Bregman divergences called the Bregman chord divergences. This new class of distances does not require gradient calculations, uses two scalar parameters that can be easily tailored in applications, and asymptotically generalizes Bregman divergences. Comment: 10 pages
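
    For background (this is the standard definition, not the chord divergence introduced in the paper), the sketch below evaluates the ordinary Bregman divergence B_F(p : q) = F(p) - F(q) - <p - q, grad F(q)>; its dependence on the gradient of the generator is precisely what the chord construction avoids.

        import numpy as np

        def bregman(F, gradF, p, q):
            """Ordinary Bregman divergence induced by a strictly convex generator F."""
            return F(p) - F(q) - np.dot(p - q, gradF(q))

        # Example generators: squared Euclidean norm and negative Shannon entropy,
        # recovering half the squared distance and the Kullback-Leibler divergence.
        sq = lambda x: 0.5 * np.dot(x, x)
        sq_grad = lambda x: x
        negent = lambda x: np.sum(x * np.log(x))
        negent_grad = lambda x: np.log(x) + 1.0

        p = np.array([0.2, 0.3, 0.5])
        q = np.array([0.4, 0.4, 0.2])
        d_euclid = bregman(sq, sq_grad, p, q)         # = 0.5 * ||p - q||^2
        d_kl = bregman(negent, negent_grad, p, q)     # = KL(p || q) for probability vectors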

    A simple probabilistic construction yielding generalized entropies and divergences, escort distributions and q-Gaussians

    We give a simple probabilistic description of a transition between two states which leads to a generalized escort distribution. When the parameter of the distribution varies, it defines a parametric curve that we call an escort-path. The Rényi divergence appears as a natural by-product of the setting. We study the dynamics of the Fisher information on this path, and show in particular that the thermodynamic divergence is proportional to Jeffreys' divergence. Next, we consider the problem of inferring a distribution on the escort-path, subject to generalized moment constraints. We show that our setting naturally induces a rationale for the minimization of the Rényi information divergence. Then, we derive the optimum distribution as a generalized q-Gaussian distribution.
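
    A small sketch of the standard ingredients named above (assumed, for discrete distributions, to match the paper's conventions up to notation): the escort distribution of order q and the Rényi divergence of order alpha.

        import numpy as np

        def escort(p, q):
            """Escort distribution P_q(x) proportional to p(x)**q."""
            w = p ** q
            return w / w.sum()

        def renyi_divergence(p, r, alpha):
            """D_alpha(p || r) = log(sum p^alpha * r^(1 - alpha)) / (alpha - 1)."""
            return np.log(np.sum(p ** alpha * r ** (1.0 - alpha))) / (alpha - 1.0)

        p = np.array([0.1, 0.2, 0.7])
        r = np.array([0.3, 0.3, 0.4])
        print(escort(p, 2.0))                  # sharpens the distribution towards its mode
        print(renyi_divergence(p, r, 0.5))     # tends to KL(p || r) as alpha -> 1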

    Boundary regularity of rotating vortex patches

    We show that the boundary of a rotating vortex patch (or V-state, in the terminology of Deem and Zabusky) is of class C^∞ provided the patch is close enough to the bifurcation circle in the Lipschitz norm. The rotating patch is convex if it is close enough to the bifurcation circle in the C^2 norm. Our proof is based on Burbea's approach to V-states. Conformal mapping therefore plays a relevant role, as does the estimation, on Hölder spaces, of certain non-convolution singular integral operators of Calderón-Zygmund type. Comment: Various proofs have been shortened. One reference added.

    Generalization of entropy based divergence measures for symbolic sequence analysis

    Entropy-based measures have been frequently used in symbolic sequence analysis. A symmetrized and smoothed form of the Kullback-Leibler divergence or relative entropy, the Jensen-Shannon divergence (JSD), is of particular interest because of the properties it shares with families of other divergence measures and its interpretability in different domains, including statistical physics, information theory and mathematical statistics. The uniqueness and versatility of this measure arise from a number of attributes, including generalization to any number of probability distributions and association of weights with the distributions. Furthermore, its entropic formulation allows its generalization in different statistical frameworks, such as non-extensive Tsallis statistics and higher-order Markovian statistics. We revisit these generalizations and propose a new generalization of JSD in the integrated Tsallis and Markovian statistical framework. We show that this generalization can be interpreted in terms of mutual information. We also investigate the performance of different JSD generalizations in deconstructing chimeric DNA sequences assembled from bacterial genomes, including those of E. coli, S. enterica typhi, Y. pestis and H. influenzae. Our results show that the JSD generalizations bring more pronounced improvements when the sequences being compared are from phylogenetically proximal organisms, which are often difficult to distinguish because of their compositional similarity. While small but noticeable improvements were observed with the Tsallis statistical JSD generalization, relatively large improvements were observed with the Markovian generalization. In contrast, the proposed Tsallis-Markovian generalization yielded more pronounced improvements relative to the Tsallis and Markovian generalizations, specifically when the sequences being compared arose from phylogenetically proximal organisms.
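
    As a point of reference (the standard Shannon-entropy form only; the Tsallis and Markovian generalizations studied in the paper are not reproduced here), the weighted Jensen-Shannon divergence between symbol frequency distributions can be sketched as follows.

        import numpy as np

        def shannon_entropy(p):
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        def jsd(dists, weights):
            """JSD(P_1, ..., P_n) = H(sum_i w_i P_i) - sum_i w_i H(P_i)."""
            dists = np.asarray(dists, dtype=float)
            weights = np.asarray(weights, dtype=float)
            mixture = weights @ dists
            return shannon_entropy(mixture) - np.sum(
                [w * shannon_entropy(p) for w, p in zip(weights, dists)])

        # Two nucleotide composition vectors (A, C, G, T), e.g. weighted by segment length
        p1 = np.array([0.30, 0.20, 0.20, 0.30])
        p2 = np.array([0.20, 0.30, 0.30, 0.20])
        print(jsd([p1, p2], [0.5, 0.5]))        # zero iff the distributions coincide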

    Statistical Computing on Non-Linear Spaces for Computational Anatomy

    Computational anatomy is an emerging discipline that aims at analyzing and modeling the individual anatomy of organs and their biological variability across a population. However, understanding and modeling the shape of organs is made difficult by the absence of physical models for comparing different subjects, the complexity of shapes, and the high number of degrees of freedom involved. Moreover, the geometric nature of the anatomical features usually extracted raises the need for statistics on objects like curves, surfaces and deformations that do not belong to standard Euclidean spaces. We explain in this chapter how a Riemannian structure can provide a powerful framework for building generic statistical computing tools. We show that a few computational tools derived for each Riemannian metric can be used in practice as the basic atoms from which more complex generic algorithms, such as interpolation, filtering and anisotropic diffusion on fields of geometric features, are built. This computational framework is illustrated with the analysis of the shape of the scoliotic spine and the modeling of brain variability from sulcal lines, where the results suggest new anatomical findings.
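
    A minimal sketch of the "basic atoms" idea (illustrative only, using the unit sphere rather than the shape spaces considered in the chapter): with just the Riemannian exponential and log maps one can already compute a Fréchet mean by iterated averaging in the tangent space.

        import numpy as np

        def exp_map(x, v):
            """Exponential map on the unit sphere: follow the geodesic from x along v."""
            n = np.linalg.norm(v)
            return x if n < 1e-12 else np.cos(n) * x + np.sin(n) * v / n

        def log_map(x, y):
            """Log map on the unit sphere: tangent vector at x pointing towards y."""
            d = np.arccos(np.clip(x @ y, -1.0, 1.0))
            proj = y - (x @ y) * x
            n = np.linalg.norm(proj)
            return np.zeros_like(x) if n < 1e-12 else d * proj / n

        def frechet_mean(points, n_iter=50):
            m = points[0] / np.linalg.norm(points[0])
            for _ in range(n_iter):
                v = np.mean([log_map(m, p) for p in points], axis=0)  # tangent-space average
                m = exp_map(m, v)                                      # shoot back onto the sphere
            return m

        pts = [np.array(p) / np.linalg.norm(p) for p in ([1, 0.1, 0], [0.9, 0.2, 0.1], [1, 0, 0.2])]
        mean = frechet_mean(pts)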