Generalized SURE for Exponential Families: Applications to Regularization
Stein's unbiased risk estimate (SURE) was proposed by Stein for the
independent, identically distributed (iid) Gaussian model in order to derive
estimates that dominate least-squares (LS). In recent years, the SURE criterion
has been employed in a variety of denoising problems for choosing
regularization parameters that minimize an estimate of the mean-squared error
(MSE). However, its use has been limited to the iid case which precludes many
important applications. In this paper we begin by deriving a SURE counterpart
for general, not necessarily iid distributions from the exponential family.
This enables extending the SURE design technique to a much broader class of
problems. Based on this generalization we suggest a new method for choosing
regularization parameters in penalized LS estimators. We then demonstrate its
superior performance over the conventional generalized cross validation
approach and the discrepancy method in the context of image deblurring and
deconvolution. The SURE technique can also be used to design estimates without
predefining their structure. However, allowing for too many free parameters
impairs the performance of the resulting estimates. To address this inherent
tradeoff we propose a regularized SURE objective. Based on this design
criterion, we derive a wavelet denoising strategy that is similar in spirit to
the standard soft-threshold approach but can lead to improved MSE performance.
Comment: to appear in the IEEE Transactions on Signal Processing
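For soft-thresholding under the plain i.i.d. Gaussian sequence model, the SURE criterion the abstract builds on has a well-known closed form, and minimizing it over a grid gives a data-driven threshold. The sketch below shows that baseline case only, not the paper's generalized exponential-family SURE or its regularized variant; the function name and signal setup are illustrative assumptions:

```python
import numpy as np

def sure_soft(y, t, sigma):
    """Stein's unbiased risk estimate of the MSE of soft-thresholding
    y at threshold t, assuming y_i = theta_i + N(0, sigma^2) noise."""
    n = y.size
    # ||soft(y, t) - y||^2 = sum_i min(y_i^2, t^2)
    fit = np.minimum(y**2, t**2).sum()
    # divergence of the soft-threshold map = number of surviving coefficients
    df = np.count_nonzero(np.abs(y) > t)
    return fit - n * sigma**2 + 2 * sigma**2 * df

# Pick the threshold minimizing SURE on noisy coefficients of a sparse signal.
rng = np.random.default_rng(0)
theta = np.zeros(200)
theta[:10] = 5.0                          # sparse "true" wavelet coefficients
y = theta + rng.standard_normal(200)      # sigma = 1
grid = np.linspace(0.0, 4.0, 81)
t_best = min(grid, key=lambda t: sure_soft(y, t, 1.0))
```

Because SURE is an unbiased estimate of the MSE, minimizing it over `t` approximately minimizes the true risk without knowing `theta`.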
Nonparametric estimation by convex programming
The problem we concentrate on is as follows: given (1) a convex compact set X
in R^n, an affine mapping x ↦ A(x), a parametric family {p_μ(·)} of
probability densities and (2) i.i.d. observations of the random variable ω,
distributed with the density p_{A(x)}(·) for some (unknown) x ∈ X, estimate
the value g^T x of a given linear form at x. For several families {p_μ(·)}
with no additional assumptions on X and A, we develop computationally
efficient estimation routines which are minimax optimal, within an absolute
constant factor. We then apply these routines to recovering x itself in the
Euclidean norm.
Comment: Published in the Annals of Statistics (http://www.imstat.org/aos/) by
the Institute of Mathematical Statistics (http://www.imstat.org) at
http://dx.doi.org/10.1214/08-AOS654
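To make the problem setup concrete in its simplest instance, consider the Gaussian family p_μ = N(μ, σ²I) with a linear mapping A(x) = Ax and a box X. The sketch below is a naive plug-in baseline (least squares on the sample mean, projected onto X), not the paper's minimax-optimal convex-programming routine; all names and the box choice are illustrative assumptions:

```python
import numpy as np

def estimate_linear_form(A, omega, g, lo, hi):
    """Naive plug-in estimate of g^T x: invert the sample mean of the
    observations by least squares, project onto the box X = [lo, hi]^n,
    and evaluate the linear form g there."""
    omega_bar = omega.mean(axis=0)                 # sample mean of omega_1..omega_N
    x_ls, *_ = np.linalg.lstsq(A, omega_bar, rcond=None)
    x_hat = np.clip(x_ls, lo, hi)                  # Euclidean projection onto the box
    return g @ x_hat

# Observations omega_j ~ N(A x, sigma^2 I) for an unknown x in X = [0, 1]^2.
A = np.array([[2.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
x_true = np.array([0.3, 0.8])
g = np.array([1.0, -1.0])
rng = np.random.default_rng(1)
omega = (A @ x_true) + 0.05 * rng.standard_normal((500, 3))
est = estimate_linear_form(A, omega, g, 0.0, 1.0)  # estimates g^T x = -0.5
```

The point of the paper is that for several exponential families this kind of problem admits estimators that are minimax optimal up to an absolute constant; the plug-in above carries no such guarantee and only illustrates the data and target of the estimation problem.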