
    On a characterization of directed divergence

    Shannon's entropy has been characterized by many authors under different sets of postulates. Another measure associated with Shannon's entropy is directed divergence, also known as information gain. In this paper, a characterization theorem for directed divergence is given by assuming intuitively reasonable postulates and using functional equations.
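    The paper itself concerns a postulational characterization, but for orientation, the directed divergence (information gain) of a distribution P from a reference Q is conventionally D(P‖Q) = Σᵢ pᵢ log(pᵢ/qᵢ). The sketch below computes this standard quantity; the function name and example distributions are illustrative, not taken from the paper.

```python
import numpy as np

def directed_divergence(p, q, base=2.0):
    """Directed divergence (information gain) D(P||Q) = sum_i p_i * log(p_i / q_i).

    Zero-probability terms of P contribute nothing by convention; Q must be
    positive wherever P is positive for the quantity to be finite.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])) / np.log(base))

# Illustrative distributions, not taken from the paper.
p = [0.5, 0.25, 0.25]
q = [1.0 / 3, 1.0 / 3, 1.0 / 3]
print(directed_divergence(p, q))  # about 0.085 bits
```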

    A simple probabilistic construction yielding generalized entropies and divergences, escort distributions and q-Gaussians

    We give a simple probabilistic description of a transition between two states which leads to a generalized escort distribution. When the parameter of the distribution varies, it defines a parametric curve that we call an escort-path. The Rényi divergence appears as a natural by-product of the setting. We study the dynamics of the Fisher information on this path, and show in particular that the thermodynamic divergence is proportional to Jeffreys' divergence. Next, we consider the problem of inferring a distribution on the escort-path, subject to generalized moment constraints. We show that our setting naturally induces a rationale for the minimization of the Rényi information divergence. Finally, we derive the optimum distribution as a generalized q-Gaussian distribution.
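    As a minimal sketch of the standard objects the abstract mentions (not the paper's escort-path construction itself): the escort distribution of order q rescales probabilities as pᵢ^q / Σⱼ pⱼ^q, and the Rényi divergence of order α between P and a reference R is log(Σᵢ pᵢ^α rᵢ^(1−α)) / (α − 1). The example values below are illustrative assumptions.

```python
import numpy as np

def escort(p, q):
    """Escort distribution of order q: p_i**q, renormalized to sum to one."""
    w = np.asarray(p, dtype=float) ** q
    return w / w.sum()

def renyi_divergence(p, r, alpha):
    """Renyi divergence D_alpha(P || R) = log(sum_i p_i^alpha r_i^(1-alpha)) / (alpha - 1)."""
    p = np.asarray(p, dtype=float)
    r = np.asarray(r, dtype=float)
    return float(np.log(np.sum(p ** alpha * r ** (1.0 - alpha))) / (alpha - 1.0))

# Illustrative example (values not from the paper): escort distributions of p
# for a few orders, and the Renyi divergence between p and a uniform reference.
p = np.array([0.6, 0.3, 0.1])
u = np.full(3, 1.0 / 3.0)
for q in (0.5, 1.0, 2.0):
    print(q, escort(p, q))
print(renyi_divergence(p, u, alpha=2.0))
```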

    Hierarchical Features of Large-Scale Cortical Connectivity

    The analysis of complex networks has revealed patterns of organization in a variety of natural and artificial systems, including neuronal networks of the brain at multiple scales. In this paper, we describe a novel analysis of the large-scale connectivity between regions of the mammalian cerebral cortex, using a set of recently proposed hierarchical measurements. We examine previously identified functional clusters of brain regions in macaque visual cortex and cat cortex and find significant differences between such clusters in terms of several hierarchical measures, revealing differences in how these clusters are embedded in the overall cortical architecture. For example, the ventral cluster of visual cortex maintains structurally more segregated, less divergent connections than the dorsal cluster, which may point to functionally different roles of their constituent brain regions.
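    The specific hierarchical measurements used in the paper are not reproduced here, but the basic quantity they build on, the hierarchical degree of a node (the number of nodes at exactly a given shortest-path distance), can be sketched as below. The toy connectivity graph and region labels are illustrative assumptions, not the macaque or cat data sets.

```python
import networkx as nx

def hierarchical_degree(G, node, d):
    """Number of nodes whose shortest-path distance from `node` is exactly d."""
    lengths = nx.single_source_shortest_path_length(G, node)
    return sum(1 for dist in lengths.values() if dist == d)

# Toy directed graph standing in for an inter-regional connectivity matrix.
G = nx.DiGraph([("V1", "V2"), ("V1", "V3"), ("V2", "V4"), ("V3", "V4"), ("V4", "V5")])
for d in (1, 2, 3):
    print(d, hierarchical_degree(G, "V1", d))
```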

    Time series irreversibility: a visibility graph approach

    We propose a method to measure real-valued time series irreversibility which combines two different tools: the horizontal visibility algorithm and the Kullback-Leibler divergence. This method maps a time series to a directed network according to a geometric criterion. The degree of irreversibility of the series is then estimated by the Kullback-Leibler divergence (i.e. the distinguishability) between the in- and out-degree distributions of the associated graph. The method is computationally efficient, does not require any ad hoc symbolization process, and naturally takes into account multiple scales. We find that the method correctly distinguishes between reversible and irreversible stationary time series, including analytical and numerical studies of its performance for: (i) reversible stochastic processes (uncorrelated and Gaussian linearly correlated), (ii) irreversible stochastic processes (a discrete flashing ratchet in an asymmetric potential), (iii) reversible (conservative) and irreversible (dissipative) chaotic maps, and (iv) dissipative chaotic maps in the presence of noise. Two alternative graph functionals, the degree and the degree-degree distributions, can be used as the Kullback-Leibler divergence argument. The former is simpler and more intuitive and can be used as a benchmark, but in the case of an irreversible process with null net current, the degree-degree distribution has to be considered to identify the irreversible nature of the series.
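    A minimal sketch of the pipeline described above, under the usual horizontal visibility criterion (two points are linked when every intermediate value lies strictly below both, with edges oriented forward in time): build the directed graph, then compare the out- and in-degree distributions with a Kullback-Leibler divergence. The small-probability floor used for unseen degrees is an illustrative shortcut, not the paper's estimator.

```python
import numpy as np
from collections import Counter

def directed_hvg_degrees(x):
    """Out- and in-degrees of the directed horizontal visibility graph of x.

    Points i < j are linked when every intermediate value is strictly below
    both x[i] and x[j]; edges point forward in time (from i to j).
    """
    n = len(x)
    k_out = np.zeros(n, dtype=int)
    k_in = np.zeros(n, dtype=int)
    for i in range(n):
        running_max = -np.inf  # largest value strictly between i and j
        for j in range(i + 1, n):
            if running_max < x[i] and running_max < x[j]:
                k_out[i] += 1
                k_in[j] += 1
            running_max = max(running_max, x[j])
            if running_max >= x[i]:
                break  # no later point can see point i any more
    return k_out, k_in

def degree_distribution(degrees):
    """Empirical probability of each degree value."""
    counts = Counter(int(k) for k in degrees)
    total = sum(counts.values())
    return {k: c / total for k, c in counts.items()}

def kl_divergence(p_out, p_in, floor=1e-12):
    """D(P_out || P_in); `floor` crudely handles degrees absent from P_in."""
    return sum(p * np.log(p / p_in.get(k, floor)) for k, p in p_out.items())

# Illustrative use on white noise, which is statistically time-reversible,
# so the estimated irreversibility should be close to zero.
rng = np.random.default_rng(1)
x = rng.standard_normal(2000)
k_out, k_in = directed_hvg_degrees(x)
print(kl_divergence(degree_distribution(k_out), degree_distribution(k_in)))
```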

    Quadratically-Regularized Optimal Transport on Graphs

    Optimal transportation provides a means of lifting distances between points on a geometric domain to distances between signals over the domain, expressed as probability distributions. On a graph, transportation problems can be used to express challenging tasks involving matching supply to demand with minimal shipment expense; in discrete language, these become minimum-cost network flow problems. Regularization is typically needed to ensure uniqueness in the linear ground distance case and to improve optimization convergence; state-of-the-art techniques employ entropic regularization on the transportation matrix. In this paper, we explore a quadratic alternative to entropic regularization for transport over a graph. We theoretically analyze the behavior of quadratically-regularized graph transport, characterizing how regularization affects the structure of flows in the regime of small but nonzero regularization. We further exploit elegant second-order structure in the dual of this problem to derive an easily implemented Newton-type optimization algorithm.
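    The paper's contribution is the analysis of edge-flow transport on a graph and a Newton-type dual algorithm; as a hedged illustration of the objective only, the sketch below solves a small quadratically regularized transport problem in coupling-matrix form with a generic convex solver (cvxpy). The cost matrix, marginals, and the regularization parameter `lam` are illustrative assumptions, not the paper's setup.

```python
import numpy as np
import cvxpy as cp

# Small quadratically regularized transport problem between two discrete
# marginals, solved with a generic convex solver rather than the paper's
# Newton-type dual algorithm. All problem data below is illustrative.
rng = np.random.default_rng(0)
n, m = 4, 5
C = rng.random((n, m))          # ground cost between source and target supports
mu = np.full(n, 1.0 / n)        # source marginal
nu = np.full(m, 1.0 / m)        # target marginal
lam = 0.1                       # quadratic regularization strength (assumed)

T = cp.Variable((n, m), nonneg=True)
objective = cp.Minimize(cp.sum(cp.multiply(C, T)) + 0.5 * lam * cp.sum_squares(T))
constraints = [cp.sum(T, axis=1) == mu, cp.sum(T, axis=0) == nu]
cp.Problem(objective, constraints).solve()

print("regularized transport plan:\n", np.round(T.value, 3))
```

    With a small but nonzero `lam`, the solution stays close to an unregularized minimum-cost plan while the quadratic term spreads mass over slightly more entries, making the optimum unique.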