
    Varentropy Decreases Under the Polar Transform

    We consider the evolution of the variance of entropy (varentropy) under the polar transform applied to binary data elements (BDEs). A BDE is a pair (X, Y) consisting of a binary random variable X and an arbitrary side-information random variable Y. The varentropy of (X, Y) is defined as the variance of the random variable -log p_{X|Y}(X|Y). A polar transform of order two is a certain mapping that takes two independent BDEs and produces two new BDEs that are correlated with each other. It is shown that the sum of the varentropies at the output of the polar transform is less than or equal to the sum of the varentropies at the input, with equality if and only if at least one of the inputs has zero varentropy. This result is extended to polar transforms of higher orders, and it is shown that the varentropy decreases to zero asymptotically when the BDEs at the input are independent and identically distributed.
    Comment: Presented in part at ISIT 2014. Accepted for publication in the IEEE Trans. Inform. Theory, March 201
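The varentropy inequality in this abstract can be checked numerically in the simplest special case: i.i.d. Bernoulli inputs with trivial side information. The sketch below (function names are illustrative, not from the paper) computes the varentropies of the two synthetic BDEs produced by the order-2 polar transform U1 = X1 XOR X2, U2 = X2, and verifies that their sum does not exceed the sum at the input.

```python
import math
from itertools import product

def varentropy_bernoulli(p):
    """Varentropy of a Bernoulli(p) source with trivial side info:
    Var(-log p(X)) = p*(1-p)*(log(p/(1-p)))**2."""
    q = 1.0 - p
    return p * q * math.log(p / q) ** 2

def polar_output_varentropies(p):
    """Varentropies of (U1) and (U2 given U1) after the order-2 polar
    transform U1 = X1 ^ X2, U2 = X2, with X1, X2 i.i.d. Bernoulli(p)."""
    px = {0: 1.0 - p, 1: p}
    # joint distribution of (U1, U2), enumerated over the four input atoms
    joint = {}
    for x1, x2 in product((0, 1), repeat=2):
        u1, u2 = x1 ^ x2, x2
        joint[(u1, u2)] = joint.get((u1, u2), 0.0) + px[x1] * px[x2]
    pu1 = {u: sum(w for (a, _), w in joint.items() if a == u) for u in (0, 1)}

    def var_of(pairs):  # pairs: list of (weight, information value)
        mean = sum(w * z for w, z in pairs)
        return sum(w * (z - mean) ** 2 for w, z in pairs)

    # first synthetic BDE: Var(-log P(U1)); side info is trivial here
    v1 = var_of([(pu1[u], -math.log(pu1[u])) for u in (0, 1)])
    # second synthetic BDE: Var(-log P(U2 | U1))
    v2 = var_of([(w, -math.log(w / pu1[u1])) for (u1, _), w in joint.items()])
    return v1, v2

p = 0.11
v_in = 2 * varentropy_bernoulli(p)       # sum of input varentropies
v1, v2 = polar_output_varentropies(p)    # sum of output varentropies
assert v1 + v2 <= v_in                   # the inequality from the abstract
```

For p = 0.5 the input varentropy is zero (the log-likelihood is constant), which is the equality case mentioned in the abstract.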

    A New Generalized Varentropy and Its Properties

    The variance of the Shannon information of a random variable X, called varentropy, is a measure of how the information content of X is scattered around its entropy; this dispersion underlies its various applications in information theory, computer science, and statistics. In this paper, we introduce a new generalized varentropy based on the Tsallis entropy and obtain some results and bounds for it. We compare the varentropy with the Tsallis varentropy. Moreover, we study the Tsallis varentropy of order statistics, analyse this concept for residual (past) lifetime distributions, and then use these notions to introduce two new classes of distributions.
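The abstract does not state the paper's exact definition, but one natural way to generalize varentropy via the Tsallis entropy is to take the variance of the q-logarithmic information content ln_q(1/p(X)), whose mean is the Tsallis entropy. The sketch below implements that candidate definition (an assumption, possibly differing from the paper's); as q approaches 1 it recovers the ordinary Shannon varentropy.

```python
import math

def q_log(x, q):
    """Tsallis q-logarithm: ln_q(x) = (x**(1-q) - 1)/(1-q); ln_1 is ln."""
    if abs(q - 1.0) < 1e-12:
        return math.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def tsallis_varentropy(probs, q):
    """Variance of the q-log information content ln_q(1/p(X)).
    NOTE: this is one plausible generalization, assumed for illustration;
    the paper's definition may differ."""
    info = [q_log(1.0 / p, q) for p in probs]
    mean = sum(p * z for p, z in zip(probs, info))  # the Tsallis entropy S_q
    return sum(p * (z - mean) ** 2 for p, z in zip(probs, info))

probs = [0.5, 0.3, 0.2]
shannon_like = tsallis_varentropy(probs, 1.0)  # q -> 1: Shannon varentropy
```

For a uniform distribution every outcome carries the same q-log information, so the Tsallis varentropy is zero for any q, mirroring the Shannon case.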


    D11.2 Consolidated results on the performance limits of wireless communications

    Deliverable D11.2 of the European project NEWCOM#. The report presents the intermediate results of the NEWCOM# JRAs on the performance limits of wireless communications and highlights the fundamental issues investigated by WP1.1. It describes the Joint Research Activities (JRAs) identified during the first year of the project, which are currently ongoing. For each activity there is a description, an illustration of its adherence and relevance to the identified fundamental open issues, a short presentation of the preliminary results, and a roadmap for the joint research work in the next year. Appendices give technical details on the scientific activity in each JRA.
    Peer Reviewed. Preprint.
