    A Dual Measure of Uncertainty: The Deng Extropy

    Extropy has recently been introduced as the dual concept of entropy. Moreover, in the context of the Dempster–Shafer evidence theory, Deng proposed a new measure of discrimination, named the Deng entropy. In this paper, we define the Deng extropy, study its relation to the Deng entropy, and compare the two measures through examples. The behaviour of the Deng extropy under changes of focal elements is studied, a characterization result is given for the maximum Deng extropy, and, finally, a numerical example in pattern recognition is discussed to highlight the relevance of the new measure.
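
    The abstract does not reproduce the definitions, so the sketch below should be read with care: the Deng entropy of a basic probability assignment m is standard, while the Deng extropy is written here in an assumed dual form (a sum over focal elements other than the whole frame, with the complement's cardinality in place of |A|) that the abstract itself does not state.

```python
import math

def deng_entropy(bpa):
    """Deng entropy E_d(m) = -sum_A m(A) * log2(m(A) / (2^|A| - 1)).

    `bpa` maps focal elements (frozensets) to masses summing to 1;
    base-2 logarithms are a common convention in this literature.
    """
    return -sum(m * math.log2(m / (2 ** len(A) - 1))
                for A, m in bpa.items() if m > 0)

def deng_extropy(bpa, frame):
    """Assumed form of the Deng extropy:
    EX_d(m) = -sum_{A != X} (1 - m(A)) * log2((1 - m(A)) / (2^|A^c| - 1)),
    where A^c is the complement of A in the frame X.
    """
    total = 0.0
    for A, m in bpa.items():
        if A == frame or m >= 1:  # whole frame and degenerate masses contribute nothing
            continue
        comp = len(frame) - len(A)  # |A^c|
        total -= (1 - m) * math.log2((1 - m) / (2 ** comp - 1))
    return total

# Hypothetical example: frame {a, b, c} with two focal elements.
frame = frozenset("abc")
bpa = {frozenset("a"): 0.6, frozenset("ab"): 0.4}
print(deng_entropy(bpa), deng_extropy(bpa, frame))
```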

    On extropy of past lifetime distribution

    Recently, Qiu et al. (2017) introduced the residual extropy as a measure of uncertainty in residual lifetime distributions, analogous to the residual entropy (1996), and obtained some of its properties and applications. In this paper, we study the extropy that measures the uncertainty in a past lifetime distribution; this measure is called the past extropy. A characterization result is also shown for the past extropy of the largest order statistic.
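
    In this literature, for a nonnegative random variable with density f, distribution function F and survival function S = 1 - F, the residual extropy is J(X; t) = -(1/(2 S(t)^2)) ∫_t^∞ f(x)^2 dx, and the past extropy replaces the upper tail with the lower one. A minimal numerical sketch follows; the exponential example is an illustration of mine, not taken from the paper:

```python
from scipy.integrate import quad
from scipy.stats import expon

def residual_extropy(dist, t):
    """Residual extropy J(X; t) = -(1 / (2 S(t)^2)) * integral_t^inf f(x)^2 dx."""
    integral, _ = quad(lambda x: dist.pdf(x) ** 2, t, float("inf"))
    return -integral / (2 * dist.sf(t) ** 2)

def past_extropy(dist, t):
    """Past extropy J(X; t) = -(1 / (2 F(t)^2)) * integral_0^t f(x)^2 dx."""
    integral, _ = quad(lambda x: dist.pdf(x) ** 2, 0, t)
    return -integral / (2 * dist.cdf(t) ** 2)

# Sanity check: for Exponential(1) the memoryless property makes the
# residual extropy constant in t, equal to -1/4.
d = expon()
print(residual_extropy(d, 2.0))  # ≈ -0.25
print(past_extropy(d, 2.0))      # ≈ -0.328
```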

    On Cumulative Entropies in Terms of Moments of Order Statistics

    In this paper, relations between some kinds of cumulative entropies and moments of order statistics are established. By using some characterizations and the symmetry of a non-negative and absolutely continuous random variable X, lower and upper bounds for these entropies are obtained and illustrative examples are given. Through the relations with the moments of order statistics, a method is shown to compute an estimate of cumulative entropies, and an application to testing whether data are exponentially distributed is outlined.
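
    The cumulative entropy of a nonnegative random variable is CE(X) = -∫ F(x) log F(x) dx (Di Crescenzo and Longobardi, 2009). Since the empirical distribution function is constant at i/n between consecutive order statistics, a plug-in estimate falls out directly; the sketch below uses that route rather than the paper's moment-based method, and the exponential check at the end is only a convenient known value:

```python
import math
import random

def empirical_cumulative_entropy(sample):
    """Plug-in estimate of CE(X) = -∫ F(x) log F(x) dx.

    On [x_(i), x_(i+1)) the empirical CDF equals i/n, so
    CE_n = -sum_{i=1}^{n-1} (x_(i+1) - x_(i)) * (i/n) * log(i/n).
    """
    xs = sorted(sample)
    n = len(xs)
    return -sum((xs[i] - xs[i - 1]) * (i / n) * math.log(i / n)
                for i in range(1, n))

# For Exponential(1) the exact value is CE = pi^2/6 - 1 ≈ 0.6449,
# so a large simulated sample should land near that.
random.seed(0)
data = [random.expovariate(1.0) for _ in range(10_000)]
print(empirical_cumulative_entropy(data))
```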

    On Tsallis extropy with an application to pattern recognition

    Recently, a new measure of information called extropy has been introduced by Lad, Sanfilippo and Agrò as the dual of the Shannon entropy. In the literature, Tsallis introduced a measure for discrete random variables, named the Tsallis entropy, as a generalization of Boltzmann-Gibbs statistics. In this work, a new measure of discrimination, called the Tsallis extropy, is introduced and some of its properties are discussed. The relation between the Tsallis extropy and entropy is given, and some bounds are also presented. Finally, an application of this extropy to pattern recognition is demonstrated.
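
    For a discrete distribution p = (p_1, ..., p_n), the Tsallis entropy is S_q(p) = (1 - Σ p_i^q)/(q - 1), which recovers the Shannon entropy as q → 1. The extropy variant below is an assumed form, obtained by applying the same q-deformation to the extropy -Σ (1 - p_i) log(1 - p_i); the abstract does not state the paper's definition.

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum p_i^q) / (q - 1)."""
    if q == 1:  # Shannon limit
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

def tsallis_extropy(p, q):
    """Assumed Tsallis extropy:
    JS_q(p) = (n - 1 - sum (1 - p_i)^q) / (q - 1),
    which tends to the extropy -sum (1 - p_i) log(1 - p_i) as q -> 1
    (note sum (1 - p_i) = n - 1, so the numerator vanishes at q = 1).
    """
    n = len(p)
    if q == 1:  # extropy of Lad, Sanfilippo and Agrò
        return -sum((1 - pi) * math.log(1 - pi) for pi in p if pi < 1)
    return (n - 1 - sum((1 - pi) ** q for pi in p)) / (q - 1)

p = [0.5, 0.3, 0.2]
for q in (0.5, 1.0, 2.0):
    print(q, tsallis_entropy(p, q), tsallis_extropy(p, q))
```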