2 research outputs found

    On a Variational Definition for the Jensen-Shannon Symmetrization of Distances based on the Information Radius

    We generalize the Jensen-Shannon divergence by considering a variational definition with respect to a generic mean, thereby extending the notion of Sibson's information radius. The variational definition applies to arbitrary distances and yields another way to define a Jensen-Shannon symmetrization of distances. When the variational optimization is further constrained to prescribed families of probability measures, we obtain relative Jensen-Shannon divergences and symmetrizations which generalize the concept of information projections. Finally, we discuss applications of these variational Jensen-Shannon divergences and diversity indices to clustering and quantization of probability measures, including statistical mixtures.
    Comment: 28 pages, 2 figures
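    The identity underlying this variational definition is well known for the ordinary Jensen-Shannon divergence: JS(p, q) = min_c [ (1/2) KL(p : c) + (1/2) KL(q : c) ], with the minimum attained at the arithmetic mixture m = (p + q)/2 (Sibson's information radius for two distributions). Below is a minimal numerical sketch of that special case, not the paper's code; the discrete distributions p and q are made-up examples.

        # Sketch: the arithmetic mixture m = (p + q)/2 minimizes the average
        # KL divergence to p and q, and the minimum value is JS(p, q).
        import numpy as np
        from scipy.stats import entropy  # entropy(p, c) computes KL(p : c)

        rng = np.random.default_rng(0)
        p = np.array([0.5, 0.3, 0.2])
        q = np.array([0.1, 0.4, 0.5])

        def objective(c):
            """Average KL divergence from p and q to a candidate center c."""
            return 0.5 * entropy(p, c) + 0.5 * entropy(q, c)

        m = 0.5 * (p + q)              # arithmetic mixture (the minimizer)
        js = objective(m)              # Jensen-Shannon divergence JS(p, q)

        # Random candidates on the probability simplex never beat the mixture.
        candidates = rng.dirichlet(np.ones(3), size=10_000)
        best_random = min(objective(c) for c in candidates)

        print(f"JS(p, q) via the mixture m: {js:.6f}")
        print(f"best random candidate     : {best_random:.6f}  (>= JS)")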

    Cumulant-free closed-form formulas for some common (dis)similarities between densities of an exponential family

    It is well known that the Bhattacharyya, Hellinger, Kullback-Leibler, α-divergences, and Jeffreys' divergences between densities belonging to the same exponential family admit generic closed-form formulas relying on the strictly convex and real-analytic cumulant function characterizing the exponential family. In this work, we report (dis)similarity formulas which bypass the explicit use of the cumulant function and highlight the role of quasi-arithmetic means and their multivariate mean operator extensions. In practice, these cumulant-free formulas are handy when implementing these (dis)similarities with legacy Application Programming Interfaces (APIs), since our method only requires a partial canonical factorization of the densities of the considered exponential family.
    Comment: 33 pages
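    For reference, the classical cumulant-based closed form that the abstract contrasts with expresses, e.g., the Bhattacharyya distance between two densities of the same exponential family as the Jensen gap of the cumulant function F at the natural parameters: J_F(θ1, θ2) = (F(θ1) + F(θ2))/2 - F((θ1 + θ2)/2). The sketch below (not the paper's cumulant-free method) checks this on the exponential distribution family p_λ(x) = λ exp(-λx), for which θ = -λ and F(θ) = -log(-θ); the rate parameters are made-up examples.

        # Sketch: cumulant-based closed form for the Bhattacharyya distance,
        # checked against direct numerical integration for exponential densities.
        import numpy as np
        from scipy.integrate import quad

        def F(theta):
            """Cumulant function of the exponential distribution family."""
            return -np.log(-theta)

        def bhattacharyya_closed_form(lam1, lam2):
            t1, t2 = -lam1, -lam2                       # natural parameters
            return 0.5 * (F(t1) + F(t2)) - F(0.5 * (t1 + t2))

        def bhattacharyya_numerical(lam1, lam2):
            """-log of the Bhattacharyya coefficient, by numerical integration."""
            integrand = lambda x: np.sqrt(lam1 * np.exp(-lam1 * x) *
                                          lam2 * np.exp(-lam2 * x))
            bc, _ = quad(integrand, 0.0, np.inf)
            return -np.log(bc)

        lam1, lam2 = 1.5, 4.0   # made-up rate parameters
        print(bhattacharyya_closed_form(lam1, lam2))    # ~0.116
        print(bhattacharyya_numerical(lam1, lam2))      # matches the closed form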