
    A note on bounded entropies

    The aim of the paper is to study the link between the non-additivity of some entropies and their boundedness. We propose an axiomatic construction of the entropy relying on the fact that the entropy belongs to a group isomorphic to the usual additive group. This allows us to show that the entropies that are additive with respect to the group addition for independent random variables are nonlinear transforms of the Rényi entropies, including the particular case of the Shannon entropy. As a particular example, we study as a group a bounded interval on which the addition generalizes the addition of velocities in special relativity. We show that the Tsallis-Havrda-Charvát entropy belongs to the family of entropies we define. Finally, a link is made between the approach developed in the paper and the theory of deformed logarithms.
    Comment: 10 pages, 1 figure
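    The non-additivity the abstract refers to is concrete for the Tsallis-Havrda-Charvát entropy, which satisfies the pseudo-additivity rule S_q(X,Y) = S_q(X) + S_q(Y) + (1−q)·S_q(X)·S_q(Y) for independent X and Y. Below is a minimal Python sketch, not taken from the paper, that computes the Shannon, Rényi, and Tsallis entropies from their standard definitions and checks this rule numerically; the helper rel_add is the textbook relativistic velocity-addition law, included only to illustrate the kind of bounded-group operation the abstract describes.

```python
import numpy as np

def shannon(p):
    """Shannon entropy -sum p_i log p_i (in nats)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def renyi(p, alpha):
    """Rényi entropy log(sum p_i^alpha) / (1 - alpha); Shannon in the limit alpha -> 1."""
    if np.isclose(alpha, 1.0):
        return shannon(p)
    return np.log(np.sum(np.asarray(p, dtype=float) ** alpha)) / (1.0 - alpha)

def tsallis(p, q):
    """Tsallis-Havrda-Charvát entropy (1 - sum p_i^q) / (q - 1); Shannon in the limit q -> 1."""
    if np.isclose(q, 1.0):
        return shannon(p)
    return (1.0 - np.sum(np.asarray(p, dtype=float) ** q)) / (q - 1.0)

def rel_add(x, y, c=1.0):
    """Bounded group operation on (-c, c): velocity addition (x + y) / (1 + x*y/c^2)."""
    return (x + y) / (1.0 + x * y / c**2)

# Pseudo-additivity check for independent variables (q = 2):
p, r, q = np.array([0.5, 0.5]), np.array([0.25, 0.75]), 2.0
joint = np.outer(p, r).ravel()  # joint law of independent (X, Y)
lhs = tsallis(joint, q)
rhs = tsallis(p, q) + tsallis(r, q) + (1 - q) * tsallis(p, q) * tsallis(r, q)
assert np.isclose(lhs, rhs)     # both sides equal 0.6875
```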

    A Rate-Splitting Approach to Fading Channels with Imperfect Channel-State Information

    As shown by Médard, the capacity of fading channels with imperfect channel-state information (CSI) can be lower-bounded by assuming a Gaussian channel input X with power P and by upper-bounding the conditional entropy h(X|Y,Ĥ) by the entropy of a Gaussian random variable with variance equal to the linear minimum mean-square error in estimating X from (Y,Ĥ). We demonstrate that, using a rate-splitting approach, this lower bound can be sharpened: by expressing the Gaussian input X as the sum of two independent Gaussian variables X₁ and X₂ and by applying Médard's lower bound first to bound the mutual information between X₁ and Y while treating X₂ as noise, and by applying it a second time to the mutual information between X₂ and Y while assuming X₁ to be known, we obtain a capacity lower bound that is strictly larger than Médard's lower bound. We then generalize this approach to an arbitrary number L of layers, where X is expressed as the sum of L independent Gaussian random variables of respective variances P_ℓ, ℓ = 1, …, L, summing up to P. Among all such rate-splitting bounds, we determine the supremum over power allocations P_ℓ and total number of layers L. This supremum is achieved for L → ∞ and gives rise to an analytically expressible capacity lower bound. For Gaussian fading, this novel bound is shown to converge to the Gaussian-input mutual information as the signal-to-noise ratio (SNR) grows, provided that the variance of the channel estimation error H − Ĥ tends to zero as the SNR tends to infinity.
    Comment: 28 pages, 8 figures, submitted to IEEE Transactions on Information Theory. Revised according to first round of review
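    To make the layered construction concrete, here is a small Monte Carlo sketch in Python. It assumes Rayleigh fading with estimate Ĥ ~ CN(0, var_hhat), an estimation error H − Ĥ of variance var_err independent of Ĥ, additive noise of variance var_z, and the standard form of Médard's bound, E[log(1 + P|Ĥ|²/(P·var_err + var_z))]. The per-layer expression, in which already-decoded layers are cancelled, undecoded layers act as interference seen through Ĥ, and the estimation error contributes noise proportional to the total power P, is one reading of the construction described above rather than a formula quoted from the paper; the equal power split P_ℓ = P/L is likewise only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_hhat(var_hhat, n):
    """Draw n samples of the channel estimate Ĥ ~ CN(0, var_hhat)."""
    return np.sqrt(var_hhat / 2) * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

def medard_bound(P, var_hhat, var_err, var_z, n=100_000):
    """Monte Carlo estimate (in nats) of Médard's lower bound
    E[log(1 + P|Ĥ|² / (P·var_err + var_z))]."""
    g = np.abs(sample_hhat(var_hhat, n)) ** 2
    return np.mean(np.log(1 + P * g / (P * var_err + var_z)))

def rate_splitting_bound(P, var_hhat, var_err, var_z, L, n=100_000):
    """L-layer rate-splitting bound with equal power P/L per layer (in nats).
    Layer ℓ treats the Q = P - ℓ·(P/L) of still-undecoded power as interference,
    while the estimation error adds noise P·var_err regardless of the layer."""
    g = np.abs(sample_hhat(var_hhat, n)) ** 2
    P_layer = P / L
    rate = 0.0
    for ell in range(1, L + 1):
        Q = P - ell * P_layer  # power of layers not yet decoded
        rate += np.mean(np.log(1 + P_layer * g / (Q * g + P * var_err + var_z)))
    return rate

P, var_hhat, var_err, var_z = 10.0, 1.0, 0.01, 1.0
print("Medard:", medard_bound(P, var_hhat, var_err, var_z))
for L in (1, 2, 4, 16):
    print(f"L = {L:2d}:", rate_splitting_bound(P, var_hhat, var_err, var_z, L))
# L = 1 recovers Médard's bound; the rate grows with L, consistent with the
# abstract's claim that the supremum is approached as L -> infinity.
```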