The information-theoretic meaning of Gagliardo--Nirenberg type inequalities
Gagliardo--Nirenberg inequalities are interpolation inequalities which were
proved independently by Gagliardo and Nirenberg in the late fifties. In recent
years, their connections with information theory and nonlinear diffusion
equations have made it possible to obtain some of them in optimal form, by
recovering both the sharp constants and the explicit form of the optimizers.
In this note, in the light of this recent research, we review the main
connections between Shannon-type entropies, diffusion equations and a class of
these inequalities.
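As a point of reference (the notation below is ours, not taken from the note),
a representative member of the subcritical Gagliardo--Nirenberg family studied
in this context can be written, for suitable functions u on R^d, as

```latex
\|u\|_{L^{2q}(\mathbb{R}^d)} \;\le\; C_{\mathrm{GN}}\,
\|\nabla u\|_{L^{2}(\mathbb{R}^d)}^{\theta}\,
\|u\|_{L^{q+1}(\mathbb{R}^d)}^{1-\theta},
```

where the exponent \theta is fixed by scaling invariance; the sharp constant
C_{GN} and the explicit optimizers are the quantities recovered through the
entropy methods for nonlinear diffusion mentioned in the abstract.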
On some interrelations of generalized q-entropies and a generalized Fisher information, including a Cram\'er-Rao inequality
In this communication, we describe some interrelations between generalized
q-entropies and a generalized version of the Fisher information. In
information theory, the de Bruijn identity links the Fisher information and
the derivative of the entropy. We show that this identity can be extended to
generalized versions of the entropy and the Fisher information. More
precisely, a generalized Fisher information naturally arises in the expression
of the derivative of the Tsallis entropy. This generalized Fisher information
also appears as a special case of a generalized Fisher information for
estimation problems. Indeed, we derive here a new Cram\'er-Rao inequality for
the estimation of a parameter, which involves a generalized form of the Fisher
information. This generalized Fisher information reduces to the standard
Fisher information as a particular case. In the case of a translation
parameter, the general Cram\'er-Rao inequality leads to an inequality for
distributions which is saturated by generalized q-Gaussian distributions.
These generalized q-Gaussians are important in several areas of physics and
mathematics. They are known to maximize the q-entropies subject to a moment
constraint. The Cram\'er-Rao inequality shows that the generalized q-Gaussians
also minimize the generalized Fisher information among distributions with a
fixed moment. Similarly, the generalized q-Gaussians also minimize the
generalized Fisher information among distributions with a given q-entropy.
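For orientation (standard classical statements, in our own notation), the two
identities that this communication generalizes are the de Bruijn identity and
the Cram\'er-Rao bound:

```latex
\frac{d}{dt}\, h\!\left(X + \sqrt{t}\,Z\right)
  \;=\; \tfrac{1}{2}\, I\!\left(X + \sqrt{t}\,Z\right),
\qquad
\operatorname{Var}\bigl(\hat{\theta}\bigr) \;\ge\; \frac{1}{I(\theta)},
```

where h is the Shannon entropy, Z is a standard Gaussian independent of X, and
I denotes the Fisher information. The paper's program replaces h by the
Tsallis q-entropy and I by a generalized Fisher information, with the
q-Gaussians playing the extremal role that ordinary Gaussians play here.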
R\'enyi Entropy Power Inequalities via Normal Transport and Rotation
Following a recent proof of Shannon's entropy power inequality (EPI), a
comprehensive framework for deriving various EPIs for the R\'enyi entropy is
presented that uses transport arguments from normal densities and a change of
variable by rotation. Simple arguments are given to recover the previously
known R\'enyi EPIs and derive new ones, by unifying a multiplicative form with
constant c and a modification with exponent {\alpha} of previous works. In
particular, for log-concave densities, we obtain a simple transportation proof
of a sharp varentropy bound.
Comment: 17 pages. Entropy Journal, to appear.
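For context (this is the standard statement, not a result of the paper),
Shannon's entropy power inequality for independent random vectors X, Y in R^n
asserts

```latex
N(X+Y) \;\ge\; N(X) + N(Y),
\qquad
N(X) \;=\; \frac{1}{2\pi e}\, e^{2h(X)/n},
```

where h is the differential entropy. The R\'enyi variants discussed above
replace h by the R\'enyi entropy of order \alpha, at the cost of either a
multiplicative constant c or a modified exponent, which is what the
transport-and-rotation framework unifies.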
A novel automated autism spectrum disorder detection system
Autism spectrum disorder (ASD) is a neurological and developmental disorder that begins early in childhood and lasts throughout a person’s life. Autism is influenced by both genetic and environmental factors. Lack of social interaction, communication problems, and a limited range of behaviors and interests are possible characteristics of autism in children, alongside other symptoms. Electroencephalograms (EEGs) provide useful information about changes in brain activity and hence are effectively used for the diagnosis of neurological disease. Eighteen nonlinear features were extracted from the EEG signals of 40 children diagnosed with autism spectrum disorder and 37 children with no diagnosis of a neurodevelopmental disorder. Feature selection was performed using Student’s t test, and Marginal Fisher Analysis was employed for data reduction. The features were ranked according to Student’s t test. The three most significant features were used to develop the autism index, while the ranked feature set was input to SVM classifiers with polynomial kernels of degrees 1, 2, and 3 for classification. The degree-2 polynomial SVM yielded the highest classification accuracy of 98.70% with 20 features. The developed classification system is likely to aid healthcare professionals as a diagnostic tool to detect autism. With more data, in our future work, we intend to employ deep learning models and to explore a cloud-based system for the detection of autism. Our study is novel in that we have analyzed all of the nonlinear features, and we are one of the first groups to have developed an autism (ASD) index from the extracted features.
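The ranking-plus-SVM pipeline described above can be sketched as follows. This
is a minimal illustration with synthetic data standing in for the 18 nonlinear
EEG features; the group means, feature counts kept, and SVM settings here are
our own placeholder assumptions, not the authors' exact configuration.

```python
# Sketch of the t-test feature ranking + polynomial-kernel SVM pipeline.
# Synthetic Gaussian data stands in for the nonlinear EEG features
# (assumption: real features would replace X_asd / X_ctrl).
import numpy as np
from scipy.stats import ttest_ind
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_asd, n_ctrl, n_feat = 40, 37, 18          # group sizes from the abstract
X_asd = rng.normal(0.5, 1.0, (n_asd, n_feat))   # placeholder "ASD" group
X_ctrl = rng.normal(0.0, 1.0, (n_ctrl, n_feat)) # placeholder control group
X = np.vstack([X_asd, X_ctrl])
y = np.array([1] * n_asd + [0] * n_ctrl)

# Rank features by Student's t test: smaller p-value = more discriminative.
_, pvals = ttest_ind(X_asd, X_ctrl, axis=0)
ranking = np.argsort(pvals)

# Train a degree-2 polynomial SVM on the top-ranked features.
top = ranking[:10]                           # illustrative cutoff
clf = SVC(kernel="poly", degree=2)
scores = cross_val_score(clf, X[:, top], y, cv=5)
print(f"mean CV accuracy: {scores.mean():.3f}")
```

The choice of cross-validation and the top-10 cutoff are for illustration
only; the paper reports its best accuracy with 20 features and a degree-2
kernel.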
Stability in Gagliardo-Nirenberg inequalities
The purpose of this paper is to establish a quantitative and constructive
stability result for a class of subcritical Gagliardo-Nirenberg inequalities.
We develop a new strategy, in which the flow of the fast diffusion equation is
used as a tool: a stability result in the inequality is equivalent to an
improved rate of convergence to equilibrium for the flow. In both cases, the
tail behaviour plays a key role. The regularity properties of the parabolic
flow allow us to connect an improved entropy--entropy production inequality
during the initial time layer to spectral properties of a suitable linearized
problem which is relevant for the asymptotic time layer. Altogether, the
stability in the inequalities is measured by a deficit which controls, in
strong norms, the distance to the manifold of optimal functions.
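In self-consistent notation of ours (the paper's exact normalization may
differ), the fast diffusion flow used as a tool is

```latex
\partial_t u \;=\; \Delta u^{m}, \qquad 0 < m < 1,
```

whose self-similar Barenblatt-type profiles coincide with the optimizers of
the subcritical Gagliardo--Nirenberg inequalities. The equivalence stated in
the abstract is that a quantitative stability estimate for the inequality
corresponds to an improved rate of convergence of solutions of this flow to
equilibrium, with the tail behaviour of the initial datum controlling both.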