A note on bounded entropies
The aim of the paper is to study the link between the non-additivity of some
entropies and their boundedness. We propose an axiomatic construction of the
entropy relying on the fact that entropy belongs to a group isomorphic to the
usual additive group. This allows us to show that the entropies that are additive
with respect to the addition of the group for independent random variables are
nonlinear transforms of the R\'enyi entropies, including the particular case of
the Shannon entropy. As a particular example, we study as a group a bounded
interval in which the addition is a generalization of the addition of
velocities in special relativity. We show that Tsallis-Havrda-Charvat entropy
is included in the family of entropies we define. Finally, a link is made
between the approach developed in the paper and the theory of deformed
logarithms.
Comment: 10 pages, 1 figure
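The deformed additivity the abstract attributes to the Tsallis-Havrda-Charvat entropy can be checked numerically. The sketch below uses the standard pseudo-additivity identity S_q(A,B) = S_q(A) + S_q(B) + (1-q) S_q(A) S_q(B) for independent variables; the function names and the choice q = 1.5 are ours, for illustration only.

```python
import numpy as np

# Tsallis-Havrda-Charvat entropy: S_q(p) = (1 - sum_i p_i^q) / (q - 1).
def tsallis(p, q):
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Deformed "group addition" a (+) b = a + b + (1-q)*a*b. Like the
# relativistic addition of velocities, it keeps its arguments inside a
# bounded interval, and it reduces to ordinary addition as q -> 1.
def deformed_add(a, b, q):
    return a + b + (1.0 - q) * a * b

q = 1.5
p = np.array([0.2, 0.3, 0.5])
r = np.array([0.6, 0.4])
joint = np.outer(p, r).ravel()   # joint law of two independent variables

lhs = tsallis(joint, q)                                  # S_q of the pair
rhs = deformed_add(tsallis(p, q), tsallis(r, q), q)      # deformed sum
print(lhs, rhs)   # equal: S_q is additive w.r.t. the deformed addition
```

The identity holds exactly because sum_{ij} (p_i r_j)^q factors into (sum_i p_i^q)(sum_j r_j^q), which is the algebraic content of the deformed-logarithm viewpoint mentioned at the end of the abstract.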
A Rate-Splitting Approach to Fading Channels with Imperfect Channel-State Information
As shown by M\'edard, the capacity of fading channels with imperfect
channel-state information (CSI) can be lower-bounded by assuming a Gaussian
channel input X with power P and by upper-bounding the conditional entropy of X
by the entropy of a Gaussian random variable with variance
equal to the linear minimum mean-square error in estimating X from
the channel output Y. We demonstrate that, using a rate-splitting approach, this lower
bound can be sharpened: by expressing the Gaussian input X as the sum of two
independent Gaussian variables X_1 and X_2, by applying M\'edard's lower
bound first to bound the mutual information between X_1 and Y while
treating X_2 as noise, and by applying it a second time to the mutual
information between X_2 and Y while assuming X_1 to be known, we obtain a
capacity lower bound that is strictly larger than M\'edard's lower bound. We
then generalize this approach to an arbitrary number L of layers, where X
is expressed as the sum of L independent Gaussian random variables of
respective variances P_1, ..., P_L, summing up to P. Among
all such rate-splitting bounds, we determine the supremum over power
allocations and total number of layers L. This supremum is achieved
as L tends to infinity and gives rise to an analytically expressible capacity lower
bound. For Gaussian fading, this novel bound is shown to converge to the
Gaussian-input mutual information as the signal-to-noise ratio (SNR) grows,
provided that the variance of the channel estimation error tends to
zero as the SNR tends to infinity.
Comment: 28 pages, 8 figures, submitted to IEEE Transactions on Information
Theory. Revised according to first round of review
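The source of the rate-splitting gain can be illustrated with a Monte Carlo sketch: once X_1 is decoded, the channel-estimation error multiplies the known value of X_1 rather than a full-power random input, and averaging the log over |X_1|^2 yields a Jensen-type gain. The scalar model, variable names, and numerical values below are our assumptions for illustration, not necessarily the paper's exact setup.

```python
import numpy as np

# Assumed scalar model: Y = H*X + Z, with H = Hhat + Herr,
# Herr ~ CN(0, V), Z ~ CN(0, 1), input X ~ CN(0, P) split as X1 + X2.
rng = np.random.default_rng(0)
n = 200_000
P, V = 10.0, 0.5
P1 = P2 = P / 2

# |Hhat|^2 ~ Exp(mean 1-V) so that H has unit variance; |X1|^2 ~ Exp(mean P1).
Hhat2 = rng.exponential(1.0 - V, n)
x12 = rng.exponential(P1, n)

# Single-layer (Medard-style) bound: E[log(1 + |Hhat|^2 P / (1 + V*P))].
r_single = np.mean(np.log1p(Hhat2 * P / (1 + V * P)))

# Layer 1: X2 treated as extra noise; the estimation-error term carries
# the full input power, contributing V*P to the noise variance.
r1 = np.mean(np.log1p(Hhat2 * P1 / (1 + V * P + Hhat2 * P2)))

# Layer 2: X1 is known, so the error term Herr*X1 has conditional
# variance V*|x1|^2 instead of V*P1; the average of the log over
# |x1|^2 exceeds the log at the average (Jensen), hence the gain.
r2 = np.mean(np.log1p(Hhat2 * P2 / (1 + V * x12 + V * P2)))

print(r_single, r1 + r2)   # the two-layer bound comes out strictly larger
```

Replacing V*|x1|^2 in the layer-2 term by its mean V*P1 would make the two rates sum exactly to the single-layer bound, which is why the improvement hinges on conditioning the mean-square error on the decoded layer.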