A de Bruijn identity for symmetric stable laws
We show how some attractive information-theoretic properties of Gaussians
pass over to more general families of stable densities. We define a new score
function for symmetric stable laws, and use it to give a stable version of the
heat equation. Using this, we derive a version of the de Bruijn identity,
allowing us to write the derivative of relative entropy as an inner product of
score functions. We discuss maximum entropy properties of symmetric stable
densities.
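For orientation, the classical Gaussian form of the identity that this abstract generalizes can be stated as follows (standard notation, not taken from the paper itself):

```latex
% Classical de Bruijn identity: perturb X by an independent
% standard Gaussian Z, scaled by \sqrt{t}.
\[
  \frac{\mathrm{d}}{\mathrm{d}t}\, h\!\left(X + \sqrt{t}\,Z\right)
  = \frac{1}{2}\, J\!\left(X + \sqrt{t}\,Z\right),
\]
% where h is the differential entropy and J the Fisher information,
% written as the second moment of the score function \rho of the density f:
\[
  J(Y) = \mathbb{E}\!\left[\rho(Y)^{2}\right],
  \qquad
  \rho(y) = \frac{f'(y)}{f(y)} .
\]
```

The paper's contribution is a score function adapted to symmetric stable laws, for which the Gaussian heat equation is replaced by a stable analogue.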
Rényi entropy and improved equilibration rates to self-similarity for nonlinear diffusion equations
We investigate the large-time asymptotics of nonlinear diffusion equations
in dimension , in the exponent interval , when the initial datum is of bounded second moment. Precise
rates of convergence to the Barenblatt profile in terms of the relative Rényi
entropy are demonstrated for finite-mass solutions defined in the whole space
when they are re-normalized at each time with respect to their own
second moment. The analysis shows that the relative Rényi entropy exhibits a
better decay, for intermediate times, with respect to the standard
Ralston-Newman entropy. The result follows from a suitable use of the so-called
concavity of the Rényi entropy power.
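For reference, the Rényi entropy of a probability density and one common normalization of the associated entropy power (stated here from general knowledge of the concavity result, not from the paper; the exponent in $N_p$ may differ by convention) are:

```latex
% Rényi entropy of a probability density u on R^d, index p ≠ 1:
\[
  h_p(u) = \frac{1}{1-p}\,\log \int_{\mathbb{R}^d} u(x)^{p}\,\mathrm{d}x .
\]
% Rényi entropy power (one common normalization) and its concavity
% along the nonlinear diffusion flow:
\[
  N_p(u) = \exp\!\left[\left(\tfrac{2}{d} + p - 1\right) h_p(u)\right],
  \qquad
  \frac{\mathrm{d}^2}{\mathrm{d}t^2}\, N_p\big(u(t)\big) \le 0
  \quad \text{along } \partial_t u = \Delta u^{p}.
\]
```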
The conditional entropy power inequality for quantum additive noise channels
We prove the quantum conditional Entropy Power Inequality for quantum
additive noise channels. This inequality lower bounds the quantum conditional
entropy of the output of an additive noise channel in terms of the quantum
conditional entropies of the input state and the noise when they are
conditionally independent given the memory. We also show that this conditional
Entropy Power Inequality is optimal in the sense that we can achieve equality
asymptotically by choosing a suitable sequence of Gaussian input states. We
apply the conditional Entropy Power Inequality to find an array of
information-theoretic inequalities for conditional entropies which are the
analogues of inequalities which have already been established in the
unconditioned setting. Furthermore, we give a simple proof of the convergence
rate of the quantum Ornstein-Uhlenbeck semigroup based on Entropy Power
Inequalities.
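For context, the classical analogue of the inequality reads as follows (the paper's quantum version replaces differential entropies with conditional von Neumann entropies and the sum with a quantum additive noise channel):

```latex
% Classical conditional entropy power inequality: X, Y are R^n-valued
% and conditionally independent given the memory M.
\[
  \exp\!\left(\frac{2\,h(X + Y \mid M)}{n}\right)
  \;\ge\;
  \exp\!\left(\frac{2\,h(X \mid M)}{n}\right)
  +
  \exp\!\left(\frac{2\,h(Y \mid M)}{n}\right).
\]
```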
Gaussian States Minimize the Output Entropy of the One-Mode Quantum Attenuator
We prove that Gaussian thermal input states minimize the output von Neumann
entropy of the one-mode Gaussian quantum-limited attenuator for fixed input
entropy. The Gaussian quantum-limited attenuator models the attenuation of an
electromagnetic signal in the quantum regime. The Shannon entropy of an
attenuated real-valued classical signal is a simple function of the entropy of
the original signal. A striking consequence of energy quantization is that the
output von Neumann entropy of the quantum-limited attenuator is no longer a
function of the input entropy alone. The proof starts from the majorization
result of De Palma et al., IEEE Trans. Inf. Theory 62, 2895 (2016), and is
based on a new isoperimetric inequality. Our result implies that geometric
input probability distributions minimize the output Shannon entropy of the
thinning for fixed input entropy. Moreover, our result opens the way to the
multimode generalization, which would permit determining both the triple trade-off
region of the Gaussian quantum-limited attenuator and the classical capacity
region of the Gaussian degraded quantum broadcast channel.
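The thinning corollary can be checked numerically. The sketch below is illustrative and not taken from the paper: it applies binomial thinning with retention probability lam to a geometric input (all parameter values are arbitrary) and compares the resulting entropy with the closed form that follows from the standard fact that thinning a geometric distribution yields another geometric distribution.

```python
import math

def thinned_pmf(pmf, lam, kmax):
    """Binomial thinning: each of the N counts survives independently
    with probability lam; returns the pmf of the thinned variable.
    pmf is a list over {0, 1, ..., len(pmf) - 1}."""
    out = [0.0] * (kmax + 1)
    for n, pn in enumerate(pmf):
        for k in range(min(n, kmax) + 1):
            out[k] += pn * math.comb(n, k) * lam**k * (1 - lam)**(n - k)
    return out

def entropy(pmf):
    # Shannon entropy in nats
    return -sum(p * math.log(p) for p in pmf if p > 0)

# Geometric input on {0, 1, ...}: P(N = n) = (1 - q) q^n (truncated for computation).
q, lam, nmax = 0.5, 0.3, 400
geo = [(1 - q) * q**n for n in range(nmax + 1)]
out = thinned_pmf(geo, lam, kmax=80)

# Thinning a geometric gives a geometric with q' = lam*q / (1 - q*(1 - lam)),
# so the output entropy has a closed form.
q2 = lam * q / (1 - q * (1 - lam))
h_closed = -math.log(1 - q2) - q2 * math.log(q2) / (1 - q2)
print(entropy(out), h_closed)  # both ≈ 0.7023 nats
```

The pmf truncation bounds (nmax, kmax) are chosen so the neglected tails are far below floating-point precision for these parameter values.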
Information-Theoretic Capacity and Error Exponents of Stationary Point Processes under Random Additive Displacements
This paper studies the Shannon regime for the random displacement of
stationary point processes. Let each point of some initial stationary point
process in Euclidean space give rise to one daughter point, the location of which is
obtained by adding a random vector to the coordinates of the mother point, with
all displacement vectors independently and identically distributed for all
points. The decoding problem is then the following one: the whole mother point
process is known as well as the coordinates of some daughter point; the
displacements are only known through their law; can one find the mother of this
daughter point? The Shannon regime is that where the dimension tends to
infinity and where the logarithm of the intensity of the point process is
proportional to the dimension. We show that this problem exhibits a sharp threshold: if
the sum of the proportionality factor and of the differential entropy rate of
the noise is positive, then the probability of finding the right mother point
tends to 0 with the dimension for all point processes and decoding strategies. If this
sum is negative, there exist mother point processes, for instance Poisson, and
decoding strategies, for instance maximum likelihood, for which the probability
of finding the right mother tends to 1 with the dimension. We then use large deviations
theory to show that in the latter case, if the entropy spectrum of the noise
satisfies a large deviation principle, then the error probability goes
exponentially fast to 0 with an exponent that is given in closed form in terms
of the rate function of the noise entropy spectrum. This is done for two
classes of mother point processes: Poisson and Matérn. The practical interest
to information theory comes from the explicit connection that we also establish
between this problem and the estimation of error exponents in Shannon's
additive noise channel with power constraints on the codewords.
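The sharp threshold in the Poisson case can be illustrated with a small Monte Carlo sketch (not the paper's machinery; function names and parameter values below are illustrative choices). For Gaussian noise, maximum-likelihood decoding reduces to picking the nearest mother, and by Slivnyak's theorem the other mothers of a Poisson process form an independent Poisson process, so conditionally on the noise vector Z the success probability is exp(-intensity * vol(ball of radius |Z|)); only the noise needs to be simulated.

```python
import math
import random

random.seed(0)

def log_unit_ball_volume(n):
    # log of the volume of the unit Euclidean ball in dimension n
    return (n / 2) * math.log(math.pi) - math.lgamma(n / 2 + 1)

def success_probability(n, r, sigma=1.0, trials=500):
    """Monte Carlo estimate of P(nearest mother = true mother) when the
    mother process is Poisson with intensity exp(n * r) and the daughter
    is the mother displaced by N(0, sigma^2 I_n) noise."""
    total = 0.0
    for _ in range(trials):
        sq_norm = sum(random.gauss(0, sigma) ** 2 for _ in range(n))
        # log of (intensity * volume of the ball of radius |Z| = sqrt(sq_norm))
        log_mean = n * r + log_unit_ball_volume(n) + (n / 2) * math.log(sq_norm)
        # P(success | Z) = exp(-mean); guard exp() against overflow
        total += math.exp(-math.exp(log_mean)) if log_mean < 700 else 0.0
    return total / trials

# The threshold is r + (1/2)*log(2*pi*e*sigma^2) = 0, i.e. r* ~ -1.419 for sigma = 1:
below = success_probability(200, -2.0)  # below threshold: decoding succeeds
above = success_probability(200, -1.0)  # above threshold: decoding fails
```

With the log-intensity rate r = -2 the estimated success probability is essentially 1, while with r = -1 it is essentially 0, matching the sign condition on the sum of the proportionality factor and the Gaussian differential entropy rate described in the abstract.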