    Varentropy Decreases Under the Polar Transform

    We consider the evolution of the variance of entropy (varentropy) under the polar transform applied to binary data elements (BDEs). A BDE is a pair (X, Y) consisting of a binary random variable X and an arbitrary side-information random variable Y. The varentropy of (X, Y) is defined as the variance of the random variable -\log p_{X|Y}(X|Y). A polar transform of order two is a mapping that takes two independent BDEs and produces two new BDEs that are correlated with each other. It is shown that the sum of the varentropies at the output of the polar transform is less than or equal to the sum of the varentropies at the input, with equality if and only if at least one of the inputs has zero varentropy. This result is extended to polar transforms of higher orders, and it is shown that the varentropy decreases to zero asymptotically when the BDEs at the input are independent and identically distributed.
    Comment: Presented in part at ISIT 2014. Accepted for publication in the IEEE Trans. Inform. Theory, March 201
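
    A minimal numerical sketch of the order-two result, assuming trivial side information (Y constant) and two i.i.d. Bernoulli(p) inputs, with the standard transform U1 = X1 XOR X2, U2 = X2; the function names below are illustrative, not from the paper:

        import numpy as np
        from itertools import product

        def varentropy(pmf):
            # Variance of -log2 p(X) for a pmf given as {value: prob}.
            probs = np.array([p for p in pmf.values() if p > 0])
            info = -np.log2(probs)
            h = probs @ info                  # entropy H(X)
            return probs @ (info - h) ** 2    # varentropy V(X)

        def polar_output_varentropies(p):
            # Order-2 transform of two i.i.d. Bernoulli(p) inputs:
            # U1 = X1 xor X2, U2 = X2; returns (V(U1), V(U2 | U1)).
            px = {0: 1.0 - p, 1: p}
            joint = {(u1, u2): px[u1 ^ u2] * px[u2]
                     for u1, u2 in product((0, 1), repeat=2)}
            pu1 = {u1: joint[(u1, 0)] + joint[(u1, 1)] for u1 in (0, 1)}
            # Conditional varentropy: Var of -log2 p(U2 | U1) under the joint law.
            info = {k: -np.log2(pk / pu1[k[0]]) for k, pk in joint.items() if pk > 0}
            h2 = sum(joint[k] * info[k] for k in info)
            v2 = sum(joint[k] * (info[k] - h2) ** 2 for k in info)
            return varentropy(pu1), v2

        p = 0.11
        v_in = 2 * varentropy({0: 1 - p, 1: p})   # sum of input varentropies
        v1, v2 = polar_output_varentropies(p)
        print(f"input sum = {v_in:.4f}, output sum = {v1 + v2:.4f}")

    For p = 0.11 this prints an input sum of about 1.78 against an output sum of about 1.21 (in squared bits), consistent with the theorem.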

    Extropy and Varextropy estimators with applications

    In many statistical studies, measures of uncertainty such as the entropy, extropy, varentropy, and varextropy of a distribution function are of prime interest. This paper proposes estimators of extropy and varextropy and shows that they are consistent. Based on the extropy estimator, a test of symmetry is given; the proposed test has the advantage that the centre of symmetry need not be estimated. The critical values and power of the proposed test statistic are obtained, and the test procedure is applied to six real-life data sets to verify its performance in detecting symmetry.
    Comment: arXiv admin note: text overlap with arXiv:2209.0670
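
    The paper's estimators are not reproduced here; as a rough illustration, a naive plug-in scheme estimates the density with a Gaussian KDE and exploits the identity J(X) = -(1/2) \int f(x)^2 dx = -(1/2) E[f(X)], so a sample mean of the estimated density suffices (varextropy is treated analogously as Var(-(1/2) f(X))):

        import numpy as np
        from scipy.stats import gaussian_kde

        def extropy_plugin(x):
            # Plug-in estimate of J(X) = -0.5 * E[f(X)]: average the KDE at the sample.
            dens = gaussian_kde(x)(x)
            return -0.5 * dens.mean()

        def varextropy_plugin(x):
            # Plug-in estimate of VJ(X) = Var(-0.5 * f(X)).
            dens = gaussian_kde(x)(x)
            return np.var(-0.5 * dens)

        rng = np.random.default_rng(0)
        sample = rng.normal(size=1000)
        print(extropy_plugin(sample))    # true value for N(0,1): -1/(4*sqrt(pi)) ~ -0.141
        print(varextropy_plugin(sample))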

    A New Generalized Varentropy and Its Properties

    The variance of the Shannon information content of a random variable X, called varentropy, is a measure of how the information content of X is scattered around its entropy; it has various applications in information theory, computer science, and statistics. In this paper, we introduce a new generalized varentropy based on the Tsallis entropy and obtain some results and bounds for it. We compare the varentropy with the Tsallis varentropy. Moreover, we study the Tsallis varentropy of order statistics, analyse the concept for residual (past) lifetime distributions, and introduce two new classes of distributions based on it.
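
    The paper's exact definition is not reproduced here; one natural construction, sketched below under that assumption, takes the q-logarithmic information content ln_q(1/p(X)), whose mean is the Tsallis entropy, and defines the Tsallis varentropy as its variance, recovering the Shannon varentropy (in nats) as q -> 1:

        import numpy as np

        def q_log(x, q):
            # q-logarithm: ln_q(x) = (x**(1-q) - 1) / (1 - q); tends to ln(x) as q -> 1.
            return np.log(x) if np.isclose(q, 1.0) else (x ** (1 - q) - 1) / (1 - q)

        def tsallis_entropy(p, q):
            # Tsallis entropy as the mean q-information content E[ln_q(1/p(X))].
            p = np.asarray(p, dtype=float)
            return p @ q_log(1 / p, q)

        def tsallis_varentropy(p, q):
            # Variance of ln_q(1/p(X)); equals the Shannon varentropy at q = 1.
            p = np.asarray(p, dtype=float)
            info = q_log(1 / p, q)
            return p @ (info - p @ info) ** 2

        pmf = [0.5, 0.3, 0.2]
        print(tsallis_entropy(pmf, 2.0))      # S_2 = 1 - sum p_i^2
        print(tsallis_varentropy(pmf, 1.0))   # Shannon varentropy (nats)
        print(tsallis_varentropy(pmf, 2.0))   # a Tsallis (q = 2) analogue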
