Fundamental properties of Tsallis relative entropy
Fundamental properties of the Tsallis relative entropy in both classical and
quantum systems are studied. As one of our main results, we give a parametric
extension of the trace inequality, due to Hiai and Petz, between the quantum
relative entropy and the negative of the trace of the relative operator
entropy. The monotonicity of the quantum Tsallis relative entropy under
trace-preserving completely positive linear maps is also shown without
assuming that the density operators are invertible.
The generalized Tsallis relative entropy is defined and its subadditivity is
derived from its joint convexity. Moreover, a generalized Peierls-Bogoliubov
inequality is also proven.
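As a concrete illustration (an illustrative sketch, not taken from the paper), the classical Tsallis relative entropy under one common sign convention is $D_q(p\|r) = \bigl(1 - \sum_i p_i^q r_i^{1-q}\bigr)/(1-q)$, and it recovers the Kullback--Leibler divergence as $q \to 1$. The distributions below are arbitrary examples:

```python
import math

def tsallis_relative_entropy(p, r, q):
    """Classical Tsallis relative entropy D_q(p||r) = (1 - sum p_i^q r_i^{1-q}) / (1 - q).

    Recovers the Kullback-Leibler divergence (in nats) as q -> 1.
    """
    if q == 1.0:
        return sum(pi * math.log(pi / ri) for pi, ri in zip(p, r) if pi > 0)
    s = sum(pi**q * ri**(1 - q) for pi, ri in zip(p, r) if pi > 0)
    return (1 - s) / (1 - q)

# Arbitrary example distributions (not from the paper).
p = [0.7, 0.2, 0.1]
r = [0.5, 0.3, 0.2]

kl = tsallis_relative_entropy(p, r, 1.0)      # KL divergence
near = tsallis_relative_entropy(p, r, 0.999)  # D_q for q close to 1
assert abs(kl - near) < 1e-3                  # D_q -> KL as q -> 1
assert abs(tsallis_relative_entropy(p, p, 0.5)) < 1e-12  # vanishes when p == r
```

Note that the sign convention (and the placement of the parameter) varies across the literature; nonnegativity and the convexity properties hold for suitable parameter ranges, e.g. $q \in (0,1)$.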
Two remarks on generalized entropy power inequalities
This note contributes to the understanding of generalized entropy power
inequalities. Our main goal is to construct a counterexample concerning the
monotonicity and entropy comparison of weighted sums of independent,
identically distributed log-concave random variables. We also present a
complex analogue of a recent dependent entropy power inequality of Hao and
Jog, and give a very simple proof.
Comment: arXiv:1811.00345 is split into 2 papers
Conditional R\'enyi entropy and the relationships between R\'enyi capacities
The analogues of Arimoto's definition of conditional R\'enyi entropy and
R\'enyi mutual information are explored for abstract alphabets. These
quantities, although dependent on the reference measure, have some useful
properties similar to those known in the discrete setting. In addition to
laying out some such basic properties and the relations to R\'enyi divergences,
the relationships between the families of mutual informations defined by
Sibson, Augustin-Csisz\'ar, and Lapidoth-Pfister, as well as the corresponding
capacities, are explored.
Comment: 17 pages, 1 figure
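In the discrete setting, Arimoto's conditional R\'enyi entropy takes the form $H_\alpha(X|Y) = \frac{\alpha}{1-\alpha}\log\sum_y \bigl(\sum_x p(x,y)^\alpha\bigr)^{1/\alpha}$ and tends to the Shannon conditional entropy as $\alpha \to 1$. A small numerical sketch (the joint distribution is an arbitrary example; this does not capture the abstract-alphabet setting of the paper):

```python
import math

def arimoto_conditional_renyi(joint, alpha):
    """Arimoto's conditional Renyi entropy H_alpha(X|Y), in nats.

    joint: dict mapping (x, y) -> probability (alpha != 1).
    """
    ys = {y for (_, y) in joint}
    total = 0.0
    for y in ys:
        inner = sum(p**alpha for (x, yy), p in joint.items() if yy == y and p > 0)
        total += inner ** (1.0 / alpha)
    return (alpha / (1.0 - alpha)) * math.log(total)

def shannon_conditional(joint):
    """Shannon conditional entropy H(X|Y) = -sum p(x,y) log p(x|y), in nats."""
    ys = {y for (_, y) in joint}
    h = 0.0
    for y in ys:
        py = sum(p for (x, yy), p in joint.items() if yy == y)
        for (x, yy), p in joint.items():
            if yy == y and p > 0:
                h -= p * math.log(p / py)
    return h

# Arbitrary example joint distribution on {0,1} x {0,1}.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}
h1 = shannon_conditional(joint)
h_near = arimoto_conditional_renyi(joint, 0.999)
assert abs(h1 - h_near) < 1e-2   # Arimoto's H_alpha -> H(X|Y) as alpha -> 1
```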
The information-theoretic meaning of Gagliardo--Nirenberg type inequalities
Gagliardo--Nirenberg inequalities are interpolation inequalities that were
proved independently by Gagliardo and Nirenberg in the late fifties. In
recent years, their connections with theoretical aspects of information
theory and nonlinear diffusion equations have made it possible to obtain
some of them in optimal form, recovering both the sharp constants and the
explicit form of the optimizers. In this note, in light of this recent
research, we review the main connections between Shannon-type entropies,
diffusion equations, and a class of these inequalities.
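For reference, a representative form of the Gagliardo--Nirenberg interpolation inequality is, for suitable functions $u$ on $\mathbb{R}^n$ (the exponents are tied together by scale invariance):

```latex
\|u\|_{L^{q}(\mathbb{R}^n)} \le C\,
\|\nabla u\|_{L^{p}(\mathbb{R}^n)}^{\theta}\,
\|u\|_{L^{s}(\mathbb{R}^n)}^{1-\theta},
\qquad
\frac{1}{q} = \theta\Bigl(\frac{1}{p}-\frac{1}{n}\Bigr)
            + (1-\theta)\,\frac{1}{s},
\qquad \theta \in [0,1].
```

The sharp-form results mentioned above identify the optimal constant $C$ and the extremal functions for certain families of exponents.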
A note on operator inequalities of Tsallis relative operator entropy
Tsallis relative operator entropy was introduced as a parametric extension of
the relative operator entropy, and generalized Shannon inequalities were
shown in a previous paper. After reviewing some fundamental properties of the
Tsallis relative operator entropy, we prove several related operator
inequalities in the present paper. Our inequalities give upper and lower
bounds for the Tsallis relative operator entropy. An operator equality for
the Tsallis relative operator entropy is also shown by considering the tensor
product; this relation generalizes the pseudoadditivity of the Tsallis
entropy. As a corollary of the operator equality derived from the
tensor-product manipulation, we show several operator inequalities, including
the superadditivity and the subadditivity of the Tsallis relative operator
entropy. Our results generalize the superadditivity and the subadditivity of
the Tsallis entropy.
Comment: to appear in Linear Algebra Appl.
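For reference, the Tsallis relative operator entropy of positive invertible operators $A$ and $B$ is commonly defined via the weighted operator geometric mean:

```latex
T_{\lambda}(A\,|\,B) = \frac{A\,\#_{\lambda}\,B - A}{\lambda},
\qquad
A\,\#_{\lambda}\,B
  = A^{1/2}\bigl(A^{-1/2} B A^{-1/2}\bigr)^{\lambda} A^{1/2},
\qquad \lambda \in (0,1].
```

Letting $\lambda \to 0$ recovers the relative operator entropy $S(A\,|\,B) = A^{1/2}\log\bigl(A^{-1/2} B A^{-1/2}\bigr)A^{1/2}$, of which $T_\lambda$ is the parametric extension discussed above.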
Heat equation and convolution inequalities
It is known that many classical inequalities linked to convolutions can be
obtained by examining the monotonicity in time of convolutions of powers of
solutions to the heat equation, provided that both the exponents and the
diffusion coefficients are suitably chosen and related. This idea can be
applied to give an alternative proof of the sharp form of the classical
Young inequality and its converse, of Brascamp--Lieb-type inequalities, of
Babenko's inequality and the Pr\'ekopa--Leindler inequality, as well as of
Shannon's entropy power inequality. This note aims at presenting new proofs
of these results, in the spirit of the original arguments introduced by Stam
to prove the entropy power inequality.
Comment: 29 pages
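Young's convolution inequality mentioned above, $\|f*g\|_r \le C\,\|f\|_p\|g\|_q$ with $1 + 1/r = 1/p + 1/q$, can be sanity-checked numerically in the discrete setting, where it holds with constant $C = 1$ (the sharp continuous constant is smaller). The sequences below are arbitrary examples:

```python
def convolve(f, g):
    """Full discrete convolution of two finite sequences."""
    out = [0.0] * (len(f) + len(g) - 1)
    for i, fi in enumerate(f):
        for j, gj in enumerate(g):
            out[i + j] += fi * gj
    return out

def lp_norm(f, p):
    """l^p norm of a finite sequence."""
    return sum(abs(x) ** p for x in f) ** (1.0 / p)

# Arbitrary example sequences.
f = [0.5, 1.0, 0.25]
g = [1.0, 2.0, 0.5, 0.1]

p, q = 2.0, 1.0
r = 1.0 / (1.0 / p + 1.0 / q - 1.0)   # Young's exponent relation gives r = 2
assert lp_norm(convolve(f, g), r) <= lp_norm(f, p) * lp_norm(g, q)
```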
Information Theoretic Proofs of Entropy Power Inequalities
While most useful information-theoretic inequalities can be deduced from the
basic properties of entropy or mutual information, Shannon's entropy power
inequality (EPI) has so far been an exception: existing information-theoretic
proofs of the EPI hinge on representations of differential entropy using
either Fisher information or the minimum mean-square error (MMSE), both of
which are derived from de Bruijn's identity. In this paper, we first present
a unified view of these proofs, showing that they share two essential
ingredients: 1) a data-processing argument applied to a covariance-preserving
linear transformation; 2) an integration over a path of continuous Gaussian
perturbation. Using these ingredients, we develop a new and brief proof of
the EPI through a mutual information inequality, which replaces Stam and
Blachman's Fisher information inequality (FII) and an inequality for the MMSE
by Guo, Shamai and Verd\'u used in earlier proofs. The result has the
advantage of being very simple, in that it relies only on the basic
properties of mutual information. These ideas are then generalized to various
extended versions of the EPI: Zamir and Feder's generalized EPI for linear
transformations of the random variables, Takano and Johnson's EPI for
dependent variables, Liu and Viswanath's covariance-constrained EPI, and
Costa's concavity inequality for the entropy power.
Comment: submitted for publication in the IEEE Transactions on Information
Theory, revised version
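The EPI itself states that $N(X+Y) \ge N(X) + N(Y)$ for independent random vectors, where $N(X) = e^{2h(X)/n}/(2\pi e)$ is the entropy power; Gaussians with proportional covariances achieve equality. A quick numerical sketch in one dimension (the variances are arbitrary examples):

```python
import math

def entropy_power_gaussian(var):
    """Entropy power N(X) = exp(2 h(X)) / (2 pi e); for a Gaussian, N(X) = Var(X)."""
    h = 0.5 * math.log(2 * math.pi * math.e * var)  # differential entropy of N(0, var)
    return math.exp(2 * h) / (2 * math.pi * math.e)

# Arbitrary example variances for two independent Gaussians X and Y.
vx, vy = 1.5, 2.5
n_sum = entropy_power_gaussian(vx + vy)   # X + Y is Gaussian with variance vx + vy
# EPI holds with equality in the Gaussian case (up to floating-point error).
assert n_sum >= entropy_power_gaussian(vx) + entropy_power_gaussian(vy) - 1e-9
assert abs(entropy_power_gaussian(vx) - vx) < 1e-9
```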