MMSE Bounds Under Kullback-Leibler Divergence Constraints on the Joint Input-Output Distribution
This paper proposes a new family of lower and upper bounds on the minimum
mean squared error (MMSE). The key idea is to minimize/maximize the MMSE
subject to the constraint that the joint distribution of the input-output
statistics lies in a Kullback-Leibler divergence ball centered at some Gaussian
reference distribution. Both bounds are tight and are attained by Gaussian
distributions whose mean is identical to that of the reference distribution and
whose covariance matrix is determined by a scalar parameter that can be
obtained by finding the root of a monotonic function. The upper bound
corresponds to a minimax optimal estimator and provides performance guarantees
under distributional uncertainty. The lower bound provides an alternative to
well-known inequalities in estimation theory, such as the Cramér-Rao bound,
that is potentially tighter and defined for a larger class of distributions.
Examples of applications in signal processing and information theory illustrate
the usefulness of the proposed bounds in practice.

Comment: Submitted for publication in the IEEE Transactions on Signal Processing.
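The root-finding step in the abstract lends itself to a quick numerical illustration. Below is a minimal Python sketch of bisection on a monotonic scalar function; the function g is a hypothetical stand-in, since the paper's actual constraint function depends on the KL-ball radius and the reference covariance, which the abstract does not specify.

import numpy as np

def bisect_root(g, lo, hi, tol=1e-10, max_iter=200):
    # Root of a monotonic scalar function g on [lo, hi] by bisection.
    # Requires g(lo) and g(hi) to have opposite signs, which monotonicity
    # guarantees whenever the root lies inside the bracket.
    g_lo = g(lo)
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        g_mid = g(mid)
        if abs(g_mid) < tol:
            return mid
        if np.sign(g_mid) == np.sign(g_lo):
            lo, g_lo = mid, g_mid  # root lies in the upper half
        else:
            hi = mid               # root lies in the lower half
    return 0.5 * (lo + hi)

# Toy strictly increasing function with root at lam = 1 (illustration only).
g = lambda lam: lam**3 + lam - 2.0
lam_star = bisect_root(g, 0.0, 5.0)
print(lam_star)  # approximately 1.0

Because the function is monotonic, any sign-bracketing interval suffices and convergence is guaranteed, which is what makes the scalar parameterization of the bounds computationally cheap.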
A Cramér-Rao Type Bound for Bayesian Risk with Bregman Loss
This paper demonstrates a general class of Bayesian lower bounds for the case
in which the underlying loss function is a Bregman divergence. This class can
be considered an extension of the Weinstein-Weiss family of bounds for the
mean squared error and relies on a variational characterization of the
Bayesian risk. The
approach allows for the derivation of a version of the Cramér-Rao bound that
is specific to a given Bregman divergence. The new generalization of the
Cramér-Rao bound reduces to the classical one when the loss function is
taken to be the squared Euclidean distance. The effectiveness of the new bound
is evaluated in the Poisson and binomial noise settings.

Comment: This version contains some new examples.
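The claimed reduction is easy to verify directly: the Bregman divergence generated by a convex function \phi is D_\phi(x, y) = \phi(x) - \phi(y) - \langle \nabla\phi(y), x - y \rangle, and choosing \phi(x) = \|x\|^2 recovers the squared Euclidean distance underlying the mean squared error. The Python sketch below checks this identity numerically; the function names are illustrative and are not the paper's notation.

import numpy as np

def bregman(phi, grad_phi, x, y):
    # D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

# phi(x) = ||x||^2 makes D_phi the squared Euclidean distance,
# the case in which the generalized bound reduces to the
# classical Cramér-Rao bound.
phi = lambda x: np.dot(x, x)
grad_phi = lambda x: 2.0 * x

x = np.array([1.0, 2.0])
y = np.array([0.5, -1.0])
print(bregman(phi, grad_phi, x, y))  # 9.25
print(np.sum((x - y) ** 2))          # 9.25, identical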