The information-theoretic meaning of Gagliardo--Nirenberg type inequalities
Gagliardo--Nirenberg inequalities are interpolation inequalities which were
proved independently by Gagliardo and Nirenberg in the late fifties. In recent
years, their connections with information theory and nonlinear diffusion
equations have made it possible to obtain some of them in optimal form, by
recovering both the sharp constants and the explicit form of the optimizers.
In this note, in light of this recent research, we review the main connections
between Shannon-type entropies, diffusion equations and a class of these
inequalities.
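For orientation, a standard first-order Gagliardo--Nirenberg inequality (stated here in a common textbook form, which need not be the exact family treated in the note) reads, for smooth compactly supported $u$ on $\mathbb{R}^n$,
\[
\|u\|_{L^p(\mathbb{R}^n)} \le C\,\|\nabla u\|_{L^r(\mathbb{R}^n)}^{\theta}\,\|u\|_{L^q(\mathbb{R}^n)}^{1-\theta},
\qquad
\frac{1}{p} = \theta\Big(\frac{1}{r}-\frac{1}{n}\Big) + (1-\theta)\,\frac{1}{q},
\quad 0 \le \theta \le 1,
\]
and the usual bridge to diffusion equations is de Bruijn's identity
\[
\frac{d}{dt}\, h\big(X+\sqrt{t}\,Z\big) = \tfrac{1}{2}\, J\big(X+\sqrt{t}\,Z\big),
\]
where $h$ is the Shannon differential entropy, $J$ the Fisher information, and $Z$ a standard Gaussian independent of $X$.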
Subadditivity of Matrix phi-Entropy and Concentration of Random Matrices
Matrix concentration inequalities provide a direct way to bound the typical
spectral norm of a random matrix. The methods for establishing these results
often parallel classical arguments, such as the Laplace transform method. This
work develops a matrix extension of the entropy method, and it applies these
ideas to obtain some matrix concentration inequalities.
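As an illustration of the Laplace transform method mentioned above (a standard matrix tail bound, not this paper's new entropy-method results), for a random Hermitian matrix $Y$ one has
\[
\Pr\{\lambda_{\max}(Y) \ge t\} \le \inf_{\theta > 0}\, e^{-\theta t}\, \mathbb{E}\,\mathrm{tr}\, e^{\theta Y},
\]
which follows from Markov's inequality together with the bound $e^{\theta \lambda_{\max}(Y)} \le \mathrm{tr}\, e^{\theta Y}$ for $\theta > 0$.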
Bounds on the deficit in the logarithmic Sobolev inequality
The deficit in the logarithmic Sobolev inequality for the Gaussian measure is
considered and estimated by means of transport and information-theoretic
distances.
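For reference, the logarithmic Sobolev inequality for the standard Gaussian measure $\gamma$ on $\mathbb{R}^n$ states that, for smooth $f$ with $\int f^2\,d\gamma = 1$,
\[
\int f^2 \log f^2 \, d\gamma \le 2 \int |\nabla f|^2 \, d\gamma,
\]
and the deficit studied here is the nonnegative gap between the right-hand and left-hand sides; the normalization of the constant is one common convention and may differ from the one used in the paper.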
A Unifying Variational Perspective on Some Fundamental Information Theoretic Inequalities
This paper proposes a unifying variational approach for proving and extending
some fundamental information theoretic inequalities. Fundamental information
theory results such as maximization of differential entropy, minimization of
Fisher information (Cramér-Rao inequality), worst additive noise lemma,
entropy power inequality (EPI), and extremal entropy inequality (EEI) are
interpreted as functional problems and proved within the framework of calculus
of variations. Several applications and possible extensions of the proposed
results are briefly mentioned.
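To fix notation for two of the inequalities named above (standard statements, not the paper's variational derivations): the entropy power inequality asserts that, for independent random vectors $X, Y$ in $\mathbb{R}^n$ with densities,
\[
N(X+Y) \ge N(X) + N(Y), \qquad N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n},
\]
while the Cramér-Rao inequality, in its one-dimensional information-theoretic form, reads
\[
\mathrm{Var}(X)\, J(X) \ge 1, \qquad J(X) = \int \frac{\big(f'(x)\big)^2}{f(x)}\, dx,
\]
for a random variable $X$ with a smooth density $f$.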