Information Loss in the Human Auditory System
From the eardrum to the auditory cortex, where acoustic stimuli are decoded,
there are several stages of auditory processing and transmission where
information may be lost. In this paper, we aim to quantify the information loss in the human auditory system using information-theoretic tools.
To do so, we consider a speech communication model, where words are uttered
and sent through a noisy channel, and then received and processed by a human
listener.
We define a notion of information loss that is related to the human word
recognition rate. To assess the word recognition rate of humans, we conduct a
closed-vocabulary intelligibility test. We derive upper and lower bounds on the
information loss. Simulations reveal that the bounds are tight, and we observe that the information loss in the human auditory system increases as the signal-to-noise ratio (SNR) decreases. Our framework also allows us to study whether
humans are optimal in terms of speech perception in a noisy environment.
Towards that end, we derive optimal classifiers and compare the human and
machine performance in terms of information loss and word recognition rate. We
observe a higher information loss and lower word recognition rate for humans
compared to the optimal classifiers. In fact, depending on the SNR, the machine
classifier may outperform humans by as much as 8 dB. This implies that for the
speech-in-stationary-noise setup considered here, the human auditory system is
sub-optimal for recognizing noisy words.
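The abstract does not spell out how information loss and recognition rate are linked; a natural reading (our assumption, not necessarily the paper's exact definition) takes the loss to be the conditional entropy of the uttered word W given the listener's decision \hat{W}, which Fano's inequality ties to the word error rate P_e = 1 - (word recognition rate) over the closed vocabulary V:

  H(W \mid \hat{W}) \le h_b(P_e) + P_e \log(|V| - 1),

where h_b denotes the binary entropy function. Under this reading, the drop in recognition rate at low SNR translates directly into a larger bound on the information lost between the uttered and the perceived word.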
Guaranteed bounds on the Kullback-Leibler divergence of univariate mixtures using piecewise log-sum-exp inequalities
Information-theoretic measures such as the entropy, the cross-entropy, and the Kullback-Leibler divergence between two mixture models are core primitives in many signal processing tasks. Since the Kullback-Leibler divergence of mixtures
provably does not admit a closed-form formula, it is in practice either
estimated using costly Monte-Carlo stochastic integration, approximated, or
bounded using various techniques. We present a fast and generic method that algorithmically builds closed-form lower and upper bounds on the entropy, the cross-entropy, and the Kullback-Leibler divergence of mixtures. We illustrate the versatility of the method by reporting on experiments approximating the
Kullback-Leibler divergence between univariate exponential mixtures, Gaussian
mixtures, Rayleigh mixtures, and Gamma mixtures.
Comment: 20 pages, 3 figures.
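To make the Monte-Carlo baseline mentioned above concrete, the sketch below (our Python illustration, not code accompanying the paper) estimates KL(m1 || m2) between two univariate Gaussian mixtures by sampling from m1; its cost grows linearly with the sample size, which is precisely what closed-form bounds avoid.

import numpy as np
from scipy.special import logsumexp
from scipy.stats import norm

def gmm_logpdf(x, weights, means, stds):
    # Log-density of a univariate Gaussian mixture at each point of x.
    comp = norm.logpdf(np.asarray(x)[:, None], loc=means, scale=stds)
    return logsumexp(comp, axis=1, b=weights)

def mc_kl(m1, m2, n=200_000, seed=0):
    # Monte-Carlo estimate of KL(m1 || m2): the average of
    # log m1(x) - log m2(x) over samples x ~ m1. Unbiased but noisy.
    rng = np.random.default_rng(seed)
    w, mu, sd = m1
    idx = rng.choice(len(w), size=n, p=w)
    x = rng.normal(mu[idx], sd[idx])
    return np.mean(gmm_logpdf(x, *m1) - gmm_logpdf(x, *m2))

# Hypothetical example mixtures: (weights, means, standard deviations).
m1 = (np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([0.5, 0.5]))
m2 = (np.array([0.3, 0.7]), np.array([-0.5, 1.5]), np.array([0.7, 0.4]))
print(mc_kl(m1, m2))  # in nats; nonnegative in expectation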
Error Bounds on a Mixed Entropy Inequality
Motivated by entropy computations arising in the evaluation of the entropy decrease in bit-reset operations, we investigate the deficit in an entropic inequality involving two independent random variables, one continuous and the other discrete. In the case where the continuous random variable is Gaussian, we derive strong quantitative bounds on the deficit in the inequality. More explicitly, we show that the deficit exhibits sub-Gaussian decay with respect to the reciprocal of the standard deviation of the Gaussian variable. Moreover, up to rational terms, these results are shown to be sharp.
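The abstract leaves the inequality itself implicit; a standard candidate matching the setup (our reconstruction, not a quotation from the paper) is, for independent continuous X and discrete Y,

  h(X + Y) = h(X + Y \mid Y) + I(X + Y; Y) \le h(X) + H(Y),

since h(X + Y \mid Y) = h(X) by translation invariance and I(X + Y; Y) \le H(Y). The deficit \Delta = h(X) + H(Y) - h(X + Y) is therefore nonnegative; when X is Gaussian with standard deviation \sigma and Y is supported on well-separated points, the shifted copies of the Gaussian density barely overlap, so \Delta vanishes as \sigma \to 0, consistent with the stated sub-Gaussian decay in 1/\sigma.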
Estimating Mixture Entropy with Pairwise Distances
Mixture distributions arise in many parametric and non-parametric settings --
for example, in Gaussian mixture models and in non-parametric estimation. It is
often necessary to compute the entropy of a mixture, but, in most cases, this
quantity has no closed-form expression, making some form of approximation
necessary. We propose a family of estimators based on a pairwise distance
function between mixture components, and show that this estimator class has
many attractive properties. For many distributions of interest, the proposed
estimators are efficient to compute, differentiable in the mixture parameters,
and become exact when the mixture components are clustered. We prove this
family includes lower and upper bounds on the mixture entropy. The Chernoff α-divergence gives a lower bound when chosen as the distance function,
with the Bhattacharyya distance providing the tightest lower bound for
components that are symmetric and members of a location family. The
Kullback-Leibler divergence gives an upper bound when used as the distance
function. We provide closed-form expressions of these bounds for mixtures of
Gaussians, and discuss their applications to the estimation of mutual
information. Using numeric simulations, we then demonstrate that our bounds are significantly tighter than well-known existing bounds. This estimator class is very useful in optimization problems involving maximization or minimization of entropy and mutual information, such as MaxEnt and rate-distortion problems.
Comment: Corrects several errata in the published version, in particular in Section V (bounds on mutual information).
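As a concrete illustration of the pairwise-distance construction, the sketch below (our reading of the estimator, \hat{H}_D = \sum_i c_i H(p_i) - \sum_i c_i \ln \sum_j c_j e^{-D(p_i, p_j)}; not the authors' code) computes the KL-based upper bound and the Bhattacharyya-based lower bound for a univariate Gaussian mixture, where every pairwise quantity has a closed form.

import numpy as np
from scipy.special import logsumexp

def gaussian_entropy(sd):
    # Differential entropy of N(mu, sd^2), in nats.
    return 0.5 * np.log(2 * np.pi * np.e * sd**2)

def kl_gauss(m1, s1, m2, s2):
    # Closed-form KL(N(m1, s1^2) || N(m2, s2^2)).
    return np.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5

def bhat_gauss(m1, s1, m2, s2):
    # Closed-form Bhattacharyya distance between two Gaussians.
    v = s1**2 + s2**2
    return (m1 - m2)**2 / (4 * v) + 0.5 * np.log(v / (2 * s1 * s2))

def pairwise_bound(weights, means, stds, dist):
    # H_hat = sum_i c_i H(p_i) - sum_i c_i ln sum_j c_j exp(-D_ij).
    K = len(weights)
    D = np.array([[dist(means[i], stds[i], means[j], stds[j])
                   for j in range(K)] for i in range(K)])
    inner = logsumexp(-D, axis=1, b=weights)  # ln sum_j c_j exp(-D_ij)
    return np.sum(weights * (gaussian_entropy(stds) - inner))

# Hypothetical two-component mixture.
w, mu, sd = np.array([0.4, 0.6]), np.array([0.0, 3.0]), np.array([1.0, 0.5])
upper = pairwise_bound(w, mu, sd, kl_gauss)    # KL distance -> upper bound
lower = pairwise_bound(w, mu, sd, bhat_gauss)  # Bhattacharyya -> lower bound
print(lower, upper)  # the true mixture entropy lies between these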