    Topological and Algebraic Properties of Chernoff Information between Gaussian Graphs

    In this paper, we investigate the factors that determine the Chernoff information used to distinguish a set of Gaussian graphs. We show that the Chernoff information between two Gaussian graphs is determined by the generalized eigenvalues of their covariance matrices, and that a unit generalized eigenvalue does not affect the Chernoff information: its corresponding dimension provides no information for classification purposes. In addition, we provide a partial ordering, in terms of Chernoff information, over a series of Gaussian trees connected by independent grafting operations. Using the relationship between generalized eigenvalues and Chernoff information, we can perform optimal linear dimensionality reduction with the least loss of information for classification. Comment: Submitted to Allerton 2018; this version contains proofs of the propositions in the paper.
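
    As a concrete illustration of the eigenvalue dependence described above, the sketch below computes the Chernoff information between two zero-mean Gaussians directly from the generalized eigenvalues of their covariance matrices, using the standard closed form of the Chernoff alpha-divergence for Gaussians and a one-dimensional search over alpha. The function name and the optimization routine are illustrative choices, not the paper's implementation.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.optimize import minimize_scalar


def chernoff_information(sigma1, sigma2):
    """Chernoff information between N(0, sigma1) and N(0, sigma2),
    computed from the generalized eigenvalues of the covariance pair."""
    # Generalized eigenvalues lam_i solving sigma2 v = lam_i * sigma1 v.
    lam = eigh(sigma2, sigma1, eigvals_only=True)

    def neg_alpha_divergence(alpha):
        # Chernoff alpha-divergence of two zero-mean Gaussians, written per
        # eigenvalue; a unit eigenvalue (lam_i == 1) contributes zero, so that
        # dimension carries no classification information.
        terms = 0.5 * np.log(alpha * lam + 1.0 - alpha) - 0.5 * alpha * np.log(lam)
        return -np.sum(terms)

    # Chernoff information is the maximum of the alpha-divergence over alpha in (0, 1).
    res = minimize_scalar(neg_alpha_divergence, bounds=(1e-6, 1.0 - 1e-6), method="bounded")
    return -res.fun
```

    Note that a generalized eigenvalue equal to one contributes zero to every term, matching the abstract's observation that such dimensions can be discarded when reducing the dimension for classification.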

    Approximations of Shannon Mutual Information for Discrete Variables with Applications to Neural Population Coding

    Although Shannon mutual information has been widely used, its effective calculation is often difficult for many practical problems, including those in neural population coding. Asymptotic formulas based on Fisher information sometimes provide accurate approximations to the mutual information, but this approach is restricted to continuous variables because the calculation of Fisher information requires derivatives with respect to the encoded variables. In this paper, we consider information-theoretic bounds and approximations of the mutual information based on Kullback--Leibler divergence and R\'{e}nyi divergence. We propose several information metrics to approximate Shannon mutual information in the context of neural population coding. While all of our asymptotic formulas work for discrete variables, one of them delivers consistent performance and high accuracy regardless of whether the encoded variables are discrete or continuous. We performed numerical simulations and confirmed that our approximation formulas are highly accurate for approximating the mutual information between the stimuli and the responses of a large neural population. These approximation formulas may prove convenient for applying information theory to many practical and theoretical problems. Comment: 31 pages, 6 figures.
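
    For context, the sketch below gives a brute-force Monte Carlo reference for the quantity such approximation formulas target: the mutual information between a discrete stimulus and a Gaussian population response. The toy response model, prior, and function name are assumptions made for illustration; the paper's metrics and estimators are not reproduced here.

```python
import numpy as np
from scipy.stats import multivariate_normal
from scipy.special import logsumexp


def mi_discrete_stimulus_mc(means, cov, priors, n_samples=200_000, seed=0):
    """Monte Carlo estimate of I(X; R) in bits for a discrete stimulus X with
    prior p(x) and a Gaussian population response R | X = x ~ N(mu_x, cov)."""
    rng = np.random.default_rng(seed)
    means = np.asarray(means, float)      # shape (K, d): one mean response per stimulus
    priors = np.asarray(priors, float)    # shape (K,): stimulus probabilities
    K, d = means.shape

    # Sample stimuli and responses from the joint model.
    xs = rng.choice(K, size=n_samples, p=priors)
    noise = rng.multivariate_normal(np.zeros(d), cov, size=n_samples)
    rs = means[xs] + noise

    # log p(r | x = k) for every candidate stimulus k, then log p(r) by mixing over the prior.
    log_cond = np.stack(
        [multivariate_normal(means[k], cov).logpdf(rs) for k in range(K)], axis=1
    )
    log_marg = logsumexp(log_cond + np.log(priors), axis=1)

    # I(X; R) = E[ log p(r | x) - log p(r) ], converted from nats to bits.
    return np.mean(log_cond[np.arange(n_samples), xs] - log_marg) / np.log(2)
```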

    Distributed Detection over Random Networks: Large Deviations Performance Analysis

    We study the large deviations performance, i.e., the exponential decay rate of the error probability, of distributed detection algorithms over random networks. At each time step k, each sensor: 1) averages its decision variable with its neighbors' decision variables; and 2) accounts on-the-fly for its new observation. We show that distributed detection exhibits a "phase change" behavior. When the rate of network information flow (the speed of averaging) is above a threshold, distributed detection is asymptotically equivalent to optimal centralized detection, i.e., the exponential decay rate of the error probability for distributed detection equals the Chernoff information. When the rate of information flow is below the threshold, distributed detection achieves only a fraction of the Chernoff information rate; we quantify this achievable rate as a function of the network's rate of information flow. Simulation examples demonstrate our theoretical findings on the behavior of distributed detection over random networks. Comment: 30 pages, journal, submitted on December 3rd, 201
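
    A minimal sketch of the per-step recursion described above: each sensor first averages its decision variable with its neighbors' variables through a (doubly stochastic) weight matrix drawn for that step, then folds in its new log-likelihood-ratio observation as a running average. The exact recursion and normalization used in the paper may differ; this follows one common consensus-plus-innovations form, with hypothetical input conventions noted in the comments.

```python
import numpy as np


def running_consensus_detection(llr_stream, weight_matrices):
    """Sketch of the per-step distributed detection recursion.

    llr_stream      : array of shape (T, n_sensors), each sensor's local
                      log-likelihood ratio at every time step (assumed input).
    weight_matrices : sequence of T doubly stochastic (n_sensors, n_sensors)
                      matrices, one realization of the random network per step.
    """
    n_sensors = llr_stream.shape[1]
    x = np.zeros(n_sensors)                       # each sensor's decision variable
    for k, (W, llr) in enumerate(zip(weight_matrices, llr_stream), start=1):
        x = W @ x                                 # 1) average with neighbors' decision variables
        x = x + (llr - x) / k                     # 2) account on-the-fly for the new observation
    return x                                      # threshold each entry to decide between hypotheses
```

    Thresholding each sensor's final decision variable gives the local decisions whose error-probability decay rate, up to the Chernoff information when averaging is fast enough, is what the paper quantifies.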