Group entropies, correlation laws and zeta functions
The notion of group entropy is proposed. It makes it possible to unify and
generalize many different definitions of entropy known in the literature, such
as those of Boltzmann-Gibbs, Tsallis, Abe and Kaniadakis. Other new entropic functionals
are presented, related to nontrivial correlation laws characterizing
universality classes of systems out of equilibrium, when the dynamics is weakly
chaotic. The associated thermostatistics are discussed. The mathematical
structure underlying our construction is that of formal group theory, which
provides the general structure of the correlations among particles and dictates
the associated entropic functionals. As an example of application, the role of
group entropies in information theory is illustrated and generalizations of the
Kullback-Leibler divergence are proposed. A new connection between statistical
mechanics and zeta functions is established. In particular, Tsallis entropy is
related to the classical Riemann zeta function.
Comment: to appear in Physical Review
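For concreteness, two of the entropies this abstract says are unified by the group-entropy construction have the following standard forms (textbook definitions recalled here, not taken from the abstract itself):

```latex
% Boltzmann-Gibbs entropy
S_{BG}[p] = -\sum_i p_i \ln p_i

% Tsallis entropy of order q (recovers S_{BG} in the limit q \to 1)
S_q[p] = \frac{1 - \sum_i p_i^{\,q}}{q - 1}
```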
Properties of Classical and Quantum Jensen-Shannon Divergence
Jensen-Shannon divergence (JD) is a symmetrized and smoothed version of the
most important divergence measure of information theory, Kullback divergence.
Unlike the Kullback divergence, it determines a metric in a very direct way:
indeed, it is the square of a metric. We consider a family of divergence
measures (JD_alpha for alpha>0), the Jensen divergences of order alpha, which
generalize JD as JD_1=JD. Using a result of Schoenberg, we prove that JD_alpha
is the square of a metric for alpha in the interval (0,2], and that the
resulting metric space of probability distributions can be isometrically
embedded in a real Hilbert space. Quantum Jensen-Shannon divergence (QJD) is a
symmetrized and smoothed version of quantum relative entropy and can be
extended to a family of quantum Jensen divergences of order alpha (QJD_alpha).
We strengthen results by Lamberti et al. by proving that for qubits and pure
states, QJD_alpha^{1/2} is a metric, and that the resulting metric space can be
isometrically embedded in a real Hilbert space when alpha lies in the interval (0,2]. In analogy with
Burbea and Rao's generalization of JD, we also define general QJD by
associating a Jensen-type quantity to any weighted family of states.
Appropriate interpretations of the quantities introduced are discussed, and bounds
are derived in terms of the total variation and trace distance.
Comment: 13 pages, LaTeX; expanded contents, added references, corrected typos
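As a minimal sketch of the classical case (alpha = 1), the Jensen-Shannon divergence can be computed from Shannon entropies of the two distributions and their midpoint, and its square root spot-checked as a metric. The function names here are illustrative, not from the paper:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in nats; zero-probability entries are skipped."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def jensen_shannon_divergence(p, q):
    """JD(p, q) = H((p + q)/2) - (H(p) + H(q)) / 2."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))

# sqrt(JD) is a metric; quick spot-check of the triangle inequality
p = np.array([0.5, 0.5, 0.0])
q = np.array([0.0, 0.5, 0.5])
r = np.array([1/3, 1/3, 1/3])
d = lambda a, b: np.sqrt(jensen_shannon_divergence(a, b))
assert d(p, q) <= d(p, r) + d(r, q)
```

JD is symmetric by construction and bounded by ln 2, consistent with its role as a smoothed, symmetrized Kullback divergence.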
Some inequalities on generalized entropies
We give several inequalities on generalized entropies involving Tsallis
entropies, using some inequalities obtained by improvements of Young's
inequality. We also give a generalization of Han's inequality.
Comment: 15 pages
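For reference, the Tsallis entropy appearing in these inequalities has the standard form S_q = (1 - sum p_i^q) / (q - 1), which recovers the Shannon entropy in the limit q -> 1. A small numerical sketch (illustrative, not the paper's derivation):

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1), for q != 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def shannon_entropy(p):
    """Shannon entropy in nats, the q -> 1 limit of S_q."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# As q approaches 1, S_q converges to the Shannon entropy
p = np.array([0.1, 0.2, 0.3, 0.4])
assert abs(tsallis_entropy(p, 1.0001) - shannon_entropy(p)) < 1e-3
```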
Statistical Mechanics and Information-Theoretic Perspectives on Complexity in the Earth System
Peer reviewed
Generalization of entropy based divergence measures for symbolic sequence analysis
Entropy based measures have been frequently used in symbolic sequence analysis. A symmetrized and smoothed form of the Kullback-Leibler divergence, or relative entropy, the Jensen-Shannon divergence (JSD) is of particular interest because of the properties it shares with other families of divergence measures and its interpretability in different domains, including statistical physics, information theory and mathematical statistics. The uniqueness and versatility of this measure arise from a number of attributes, including its generalization to any number of probability distributions and the association of weights with the distributions. Furthermore, its entropic formulation allows its generalization in different statistical frameworks, such as non-extensive Tsallis statistics and higher order Markovian statistics.

We revisit these generalizations and propose a new generalization of JSD in the integrated Tsallis and Markovian statistical framework. We show that this generalization can be interpreted in terms of mutual information. We also investigate the performance of different JSD generalizations in deconstructing chimeric DNA sequences assembled from bacterial genomes, including those of E. coli, S. enterica typhi, Y. pestis and H. influenzae.

Our results show that the JSD generalizations bring more pronounced improvements when the sequences being compared are from phylogenetically proximal organisms, which are often difficult to distinguish because of their compositional similarity. While small but noticeable improvements were observed with the Tsallis statistical JSD generalization, relatively large improvements were observed with the Markovian generalization. In contrast, the proposed Tsallis-Markovian generalization yielded more pronounced improvements relative to the Tsallis and Markovian generalizations, specifically when the sequences being compared arose from phylogenetically proximal organisms.

Fil: Ré, Miguel Ángel. Universidad Tecnológica Nacional. Facultad Regional Córdoba. Centro de Investigación en Informática para la Ingeniería. Departamento de Ciencias Básicas; Argentina.
Fil: Ré, Miguel Ángel. Universidad Nacional de Córdoba. Facultad de Matemática, Astronomía y Física; Argentina.
Fil: Azad, Rajeev K. University of North Texas. College of Science. Department of Biological Sciences; United States of America.
Fil: Azad, Rajeev K. University of North Texas. College of Science. Department of Mathematics; United States of America.
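A minimal sketch of the classical weighted JSD (alpha = 1, order-0 Markov case) applied to nucleotide compositions; the paper's Tsallis and Markovian generalizations replace the Shannon entropy and use higher-order word statistics, which are not reproduced here. All function names and the toy segments are illustrative:

```python
from collections import Counter
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits; zero-probability entries are skipped."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def composition(seq, alphabet="ACGT"):
    """Mononucleotide frequency vector of a DNA string."""
    counts = Counter(seq)
    total = sum(counts[a] for a in alphabet)
    return np.array([counts[a] / total for a in alphabet])

def weighted_jsd(dists, weights):
    """JSD(p_1..p_n; w) = H(sum_i w_i p_i) - sum_i w_i H(p_i)."""
    dists = np.asarray(dists, dtype=float)
    weights = np.asarray(weights, dtype=float)
    mixture = weights @ dists
    return shannon_entropy(mixture) - sum(
        w * shannon_entropy(p) for w, p in zip(weights, dists))

# Toy chimeric comparison: an AT-rich and a GC-rich segment have disjoint
# compositions, so the equal-weight JSD attains its maximum of 1 bit.
seg1, seg2 = "ATATATTAAT" * 10, "GCGCGGCCGC" * 10
d = weighted_jsd([composition(seg1), composition(seg2)], [0.5, 0.5])
```

In segmentation applications, a sliding cut point splits a sequence into two segments whose weights are their relative lengths, and the cut maximizing the weighted JSD is taken as a candidate compositional boundary.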