244 research outputs found

    Topological and Algebraic Properties of Chernoff Information between Gaussian Graphs

    In this paper, we investigate the factors that determine the Chernoff information for distinguishing a set of Gaussian graphs. We find that the Chernoff information between two Gaussian graphs is determined by the generalized eigenvalues of their covariance matrices, and that a unit generalized eigenvalue does not affect the Chernoff information: its corresponding dimension provides no information for classification purposes. In addition, we provide a partial ordering by Chernoff information over a series of Gaussian trees connected by independent grafting operations. Using the relationship between generalized eigenvalues and Chernoff information, we can perform optimal linear dimension reduction with the least loss of information for classification.
    Comment: Submitted to Allerton 2018; this version contains proofs of the propositions in the paper.
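    As a quick illustration of the quantity driving these results, here is a minimal sketch (not the authors' code; the covariance matrices are hypothetical) that computes the generalized eigenvalues of two covariance matrices with SciPy:

```python
import numpy as np
from scipy.linalg import eigh

# Hypothetical covariance matrices of two zero-mean Gaussian models.
sigma_a = np.array([[2.0, 0.5],
                    [0.5, 1.0]])
sigma_b = np.array([[1.0, 0.3],
                    [0.3, 1.0]])

# Generalized eigenvalues lambda solving sigma_a v = lambda * sigma_b v.
# Per the abstract, an eigenvalue equal to 1 does not affect the Chernoff
# information, so its dimension can be dropped when reducing dimension
# for classification.
lam = eigh(sigma_a, sigma_b, eigvals_only=True)
print("generalized eigenvalues:", lam)
```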

    Equivalence of concentration inequalities for linear and non-linear functions

    We consider a random variable $X$ that takes values in a (possibly infinite-dimensional) topological vector space $\mathcal{X}$. We show that, with respect to an appropriate "normal distance" on $\mathcal{X}$, concentration inequalities for linear and non-linear functions of $X$ are equivalent. This normal distance corresponds naturally to the concentration rate in classical concentration results such as Gaussian concentration and concentration on the Euclidean and Hamming cubes. Under suitable assumptions on the roundness of the sets of interest, the concentration inequalities so obtained are asymptotically optimal in the high-dimensional limit.
    Comment: 19 pages, 5 figures.
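    For reference, the classical Gaussian concentration result that such a "normal distance" is calibrated against can be stated as follows (a standard inequality, quoted for context rather than taken from the paper):

```latex
% Gaussian concentration: for X ~ N(0, I_n) and any L-Lipschitz
% function f : R^n -> R,
\[
  \mathbb{P}\bigl(\,\lvert f(X) - \mathbb{E}[f(X)] \rvert \ge t \,\bigr)
  \;\le\; 2\exp\!\left(-\frac{t^2}{2L^2}\right),
  \qquad t \ge 0.
\]
```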

    Low-Dimensional Topology of Information Fusion

    We provide an axiomatic characterization of information fusion, on the basis of which we define an information fusion network. Our construction is reminiscent of tangle diagrams in low-dimensional topology. Information fusion networks come equipped with a natural notion of equivalence: equivalent networks "contain the same information" but differ locally. When fusing streams of information, an information fusion network may adaptively optimize itself inside its equivalence class. This provides a fault-tolerance mechanism for such networks.
    Comment: 8 pages. Conference proceedings version; will be superseded by a journal version.

    Model selection and local geometry

    We consider problems in model selection caused by the geometry of models close to their points of intersection. In some cases, including common classes of causal or graphical models as well as time series models, distinct models may nevertheless have identical tangent spaces. This has two immediate consequences. First, in order to obtain constant power to reject one model in favour of another, we need local alternative hypotheses that decrease to the null at a slower rate than the usual parametric $n^{-1/2}$ (typically we will require $n^{-1/4}$ or slower); in other words, to distinguish between the models we need large effect sizes or very large sample sizes. Second, we show that under even weaker conditions on their tangent cones, models in these classes cannot be made simultaneously convex by a reparameterization. This shows that Bayesian network models, amongst others, cannot be learned directly with a convex method similar to the graphical lasso. However, we are able to use our results to suggest methods for model selection that learn the tangent space directly, rather than the model itself. In particular, we give a generic algorithm for learning Bayesian network models.
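    To make the two rates concrete, in standard notation (not taken verbatim from the paper) the contrast is between local alternatives of the form:

```latex
% Usual parametric local alternatives, detectable with constant power:
\[
  H_0:\ \theta = \theta_0
  \qquad \text{vs.} \qquad
  H_{1,n}:\ \theta = \theta_0 + h\,n^{-1/2}, \quad h \neq 0,
\]
% whereas at intersection points with shared tangent spaces the abstract
% indicates that constant power requires the slower rate
\[
  H_{1,n}:\ \theta = \theta_0 + h\,n^{-1/4} \quad (\text{or slower}).
\]
```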

    On the Exact Evaluation of Certain Instances of the Potts Partition Function by Quantum Computers

    We present an efficient quantum algorithm for the exact evaluation of either the fully ferromagnetic or anti-ferromagnetic q-state Potts partition function Z for a family of graphs related to irreducible cyclic codes. This problem is related to the evaluation of the Jones and Tutte polynomials. We consider the connection between the weight enumerator polynomial from coding theory and Z, and exploit the fact that there exists a quantum algorithm for efficiently estimating Gauss sums in order to obtain the weight enumerator for a certain class of linear codes. In this way we demonstrate that for a certain class of sparse graphs, which we call Irreducible Cyclic Cocycle Code (ICCC_\epsilon) graphs, quantum computers provide a polynomial speedup in the difference between the number of edges and vertices of the graph, and an exponential speedup in q, over the best classical algorithms known to date.
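    To fix notation, the following brute-force sketch shows the quantity being evaluated, the q-state ferromagnetic Potts partition function of a small graph (illustrative only, and exponentially slow in the number of vertices, in contrast to the quantum algorithm above; the triangle-graph example is hypothetical):

```python
import itertools
import math

def potts_partition_function(n_vertices, edges, q, beta):
    """Brute-force Z = sum over colorings sigma of
    exp(beta * #{(i, j) in E : sigma_i == sigma_j}),
    the ferromagnetic q-state Potts model (beta > 0)."""
    z = 0.0
    for sigma in itertools.product(range(q), repeat=n_vertices):
        agree = sum(1 for i, j in edges if sigma[i] == sigma[j])
        z += math.exp(beta * agree)
    return z

# Hypothetical example: a triangle graph with q = 3 and beta = 1.0.
print(potts_partition_function(3, [(0, 1), (1, 2), (0, 2)], q=3, beta=1.0))
```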

    Revisiting Chernoff Information with Likelihood Ratio Exponential Families

    The Chernoff information between two probability measures is a statistical divergence measuring their deviation, defined as their maximally skewed Bhattacharyya distance. Although the Chernoff information was originally introduced for bounding the Bayes error in statistical hypothesis testing, the divergence has found many other applications, owing to its empirical robustness, in fields ranging from information fusion to quantum information. From the viewpoint of information theory, the Chernoff information can also be interpreted as a minmax symmetrization of the Kullback--Leibler divergence. In this paper, we first revisit the Chernoff information between two densities of a measurable Lebesgue space by considering the exponential families induced by their geometric mixtures: the so-called likelihood ratio exponential families. Second, we show how to (i) solve exactly the Chernoff information between any two univariate Gaussian distributions, or get a closed-form formula using symbolic computing, (ii) report a closed-form formula for the Chernoff information of centered Gaussians with scaled covariance matrices, and (iii) use a fast numerical scheme to approximate the Chernoff information between any two multivariate Gaussian distributions.
    Comment: 41 pages.
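    As a sanity check on the definition, here is a generic numerical approximation of the Chernoff information between two univariate Gaussians, a minimal sketch that maximizes the skewed Bhattacharyya distance over the skew parameter (it is not the paper's fast scheme, and the example parameters are hypothetical):

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def chernoff_information(mu1, s1, mu2, s2):
    """Approximate the Chernoff information between N(mu1, s1^2) and
    N(mu2, s2^2) as max over a in (0, 1) of the skewed Bhattacharyya
    distance -log int p^a q^(1-a) dx."""
    def skewed_bhattacharyya(a):
        integrand = lambda x: (norm.pdf(x, mu1, s1) ** a
                               * norm.pdf(x, mu2, s2) ** (1 - a))
        coeff, _ = quad(integrand, -np.inf, np.inf)
        return -np.log(coeff)

    # Maximize over the skew parameter a (minimize the negative).
    res = minimize_scalar(lambda a: -skewed_bhattacharyya(a),
                          bounds=(1e-6, 1 - 1e-6), method="bounded")
    return -res.fun, res.x

c, a_star = chernoff_information(0.0, 1.0, 1.0, 2.0)
print(f"Chernoff information ~ {c:.4f} at optimal skew a* ~ {a_star:.3f}")
```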
