Fast Poisson Noise Removal by Biorthogonal Haar Domain Hypothesis Testing
Methods based on hypothesis tests (HTs) in the Haar domain are widely used to
denoise Poisson count data. Facing large datasets or real-time applications,
Haar-based denoisers have to use the decimated transform to meet limited-memory
or computation-time constraints. Unfortunately, for regular underlying
intensities, decimation yields discontinuous estimates and strong "staircase"
artifacts. In this paper, we propose to combine the HT framework with the
decimated biorthogonal Haar (Bi-Haar) transform instead of the classical Haar.
The Bi-Haar filter bank is normalized such that the p-values of Bi-Haar
coefficients (pBH) provide a good approximation to those of Haar (pH) for
high-intensity settings or large scales; for low-intensity settings and small
scales, we show that pBH are essentially upper-bounded by pH. Thus, we may
apply the Haar-based HTs to Bi-Haar coefficients to control a prefixed false
positive rate. By doing so, we benefit from the regular Bi-Haar filter bank to
gain a smooth estimate while always maintaining a low computational complexity.
A Fisher-approximation-based threshold implementing the HTs is also
established. The efficiency of this method is illustrated on an example of
hyperspectral source-flux estimation.
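As a rough illustration of the Haar-domain hypothesis-testing framework the paper builds on (not the Bi-Haar scheme itself), the sketch below denoises Poisson counts with a one-level decimated unnormalized Haar transform: under the null hypothesis of equal intensity within a pair, one count conditioned on the pair total is Binomial(n, 1/2), so each detail coefficient gets an exact two-sided p-value. Function names and the alpha level are illustrative choices.

```python
import math

import numpy as np


def binom_two_sided_p(k, n):
    """Exact two-sided p-value for k successes out of n under Binomial(n, 1/2)."""
    if n == 0:
        return 1.0
    d = abs(k - n / 2.0)
    tail = sum(math.comb(n, j) for j in range(n + 1)
               if abs(j - n / 2.0) >= d - 1e-12)
    return min(1.0, tail / 2.0 ** n)


def haar_ht_denoise(counts, alpha=1e-3):
    """One-level decimated (unnormalized) Haar denoising of Poisson counts.

    Detail coefficients whose two-sided p-value exceeds alpha are zeroed
    (null hypothesis: equal intensity within the pair); `counts` must have
    even length. Illustrative sketch, not the paper's Bi-Haar method.
    """
    x1, x2 = counts[0::2], counts[1::2]
    s = x1 + x2                          # approximation coefficients (pair sums)
    d = x1 - x2                          # detail coefficients (pair differences)
    keep = np.array([binom_two_sided_p(k, n) < alpha for k, n in zip(x1, s)])
    d_hat = np.where(keep, d, 0)         # kill details consistent with flat intensity
    est = np.empty(counts.shape, dtype=float)
    est[0::2] = (s + d_hat) / 2.0        # inverse Haar transform
    est[1::2] = (s - d_hat) / 2.0
    return est
```

A pair like [10, 12] is smoothed to [11, 11], while [0, 50] is left intact because its difference is highly significant under the binomial null.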
Source detection using a 3D sparse representation: application to the Fermi gamma-ray space telescope
The multiscale variance stabilization transform (MSVST) has recently been
proposed for Poisson data denoising. This procedure, which is nonparametric, is
based on thresholding wavelet coefficients. We present in this paper an
extension of the MSVST to 3D data (in fact 2D-1D data) when the third dimension
is not a spatial dimension, but the wavelength, the energy, or the time. We
show that the MSVST can be used for detecting and characterizing astrophysical
sources of high-energy gamma rays, using realistic simulated observations with
the Large Area Telescope (LAT). The LAT was launched in June 2008 on the Fermi
Gamma-ray Space Telescope mission. The MSVST algorithm is very fast relative to
traditional likelihood model fitting, and permits efficient detection across
the time dimension and immediate estimation of spectral properties.
Astrophysical sources of gamma rays, especially active galaxies, are typically
quite variable, and our current work may lead to a reliable method to quickly
characterize the flaring properties of newly detected sources.
Comment: Accepted. Full paper with figures available at
http://jstarck.free.fr/aa08_msvst.pd
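The MSVST couples a variance-stabilizing transform (VST) with the multiscale decomposition; as a minimal stand-in, the classical Anscombe transform below shows what stabilization buys: Poisson data whose variance grows with the intensity are mapped to roughly unit variance, after which Gaussian thresholding machinery applies. This is a simplification for illustration, not the paper's algorithm.

```python
import numpy as np


def anscombe(x):
    """Anscombe variance-stabilizing transform for Poisson data."""
    return 2.0 * np.sqrt(x + 3.0 / 8.0)


rng = np.random.default_rng(1)
for lam in (5, 20, 100):
    raw = rng.poisson(lam, 200_000)
    # Raw variance tracks the intensity (Var = lam); after the transform
    # it is approximately 1 regardless of lam.
    print(lam, raw.var(), anscombe(raw).var())
```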
Skellam shrinkage: Wavelet-based intensity estimation for inhomogeneous Poisson data
The ubiquity of integrating detectors in imaging and other applications
implies that a variety of real-world data are well modeled as Poisson random
variables whose means are in turn proportional to an underlying vector-valued
signal of interest. In this article, we first show how the so-called Skellam
distribution arises from the fact that Haar wavelet and filterbank transform
coefficients corresponding to measurements of this type are distributed as sums
and differences of Poisson counts. We then provide two main theorems on Skellam
shrinkage, one showing the near-optimality of shrinkage in the Bayesian setting
and the other providing for unbiased risk estimation in a frequentist context.
These results serve to yield new estimators in the Haar transform domain,
including an unbiased risk estimate for shrinkage of Haar-Fisz
variance-stabilized data, along with accompanying low-complexity algorithms for
inference. We conclude with a simulation study demonstrating the efficacy of
our Skellam shrinkage estimators both for the standard univariate wavelet test
functions as well as a variety of test images taken from the image processing
literature, confirming that they offer substantial performance improvements
over existing alternatives.
Comment: 27 pages, 8 figures, slight formatting changes; submitted for
publication
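The key distributional fact behind the abstract is easy to check numerically: an unnormalized Haar detail coefficient of Poisson counts is a difference of two independent Poisson variables, i.e. Skellam-distributed, with mean a - b and variance a + b. A small simulation (illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)
a, b, n = 7.0, 3.0, 500_000
# Unnormalized Haar detail of a Poisson pair = difference of two independent
# Poisson counts, which follows a Skellam(a, b) distribution.
d = rng.poisson(a, n) - rng.poisson(b, n)
print(d.mean(), d.var())  # close to a - b = 4.0 and a + b = 10.0
```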
Generalized SURE for Exponential Families: Applications to Regularization
Stein's unbiased risk estimate (SURE) was proposed by Stein for the
independent, identically distributed (iid) Gaussian model in order to derive
estimates that dominate least-squares (LS). In recent years, the SURE criterion
has been employed in a variety of denoising problems for choosing
regularization parameters that minimize an estimate of the mean-squared error
(MSE). However, its use has been limited to the iid case which precludes many
important applications. In this paper we begin by deriving a SURE counterpart
for general, not necessarily iid distributions from the exponential family.
This enables extending the SURE design technique to a much broader class of
problems. Based on this generalization we suggest a new method for choosing
regularization parameters in penalized LS estimators. We then demonstrate its
superior performance over the conventional generalized cross validation
approach and the discrepancy method in the context of image deblurring and
deconvolution. The SURE technique can also be used to design estimates without
predefining their structure. However, allowing for too many free parameters
impairs the performance of the resulting estimates. To address this inherent
tradeoff we propose a regularized SURE objective. Based on this design
criterion, we derive a wavelet denoising strategy that is similar in spirit to
the standard soft-threshold approach but can lead to improved MSE performance.
Comment: to appear in the IEEE Transactions on Signal Processing
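For reference, the classical SURE criterion in the iid Gaussian setting (the case this paper generalizes) selects a soft threshold without access to the clean signal; the sketch below compares the SURE-chosen threshold to the oracle one that minimizes the true MSE. The sparse test signal and threshold grid are illustrative choices, not from the paper.

```python
import numpy as np


def soft(y, t):
    """Soft-threshold y at level t."""
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)


def sure_soft(y, t, sigma=1.0):
    """Stein's unbiased risk estimate of the total squared error of soft
    thresholding at t, under iid Gaussian noise with std sigma."""
    n = y.size
    return (n * sigma**2
            + np.minimum(y**2, t**2).sum()
            - 2.0 * sigma**2 * np.count_nonzero(np.abs(y) <= t))


rng = np.random.default_rng(3)
theta = np.concatenate([rng.normal(0.0, 4.0, 50), np.zeros(950)])  # sparse means
y = theta + rng.normal(0.0, 1.0, theta.size)                       # noisy data

ts = np.linspace(0.0, 4.0, 81)
t_sure = ts[np.argmin([sure_soft(y, t) for t in ts])]   # data-driven choice
mse = lambda t: ((soft(y, t) - theta) ** 2).mean()      # requires the truth
t_oracle = ts[np.argmin([mse(t) for t in ts])]
print(t_sure, t_oracle, mse(t_sure), mse(0.0))
```

The SURE threshold needs only the observations, yet its MSE is close to the oracle's and well below that of the unthresholded (least-squares) estimate.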
Astronomical Data Analysis and Sparsity: from Wavelets to Compressed Sensing
Wavelets have been used extensively for several years now in astronomy for
many purposes, ranging from data filtering and deconvolution, to star and
galaxy detection or cosmic ray removal. More recent sparse representations such
as ridgelets or curvelets have also been proposed for the detection of
anisotropic features such as cosmic strings in the cosmic microwave background.
We review in this paper a range of methods based on sparsity that have been
proposed for astronomical data analysis. We also discuss the impact of
compressed sensing, the new sampling theory, on astronomy: collecting the data,
transferring them to Earth, and reconstructing an image from incomplete
measurements.
Comment: Submitted. Full paper with figures available at
http://jstarck.free.fr/IEEE09_SparseAstro.pd
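As a toy illustration of compressed-sensing reconstruction (not any specific method from the paper), the sketch below recovers a sparse signal from random Gaussian measurements using orthogonal matching pursuit; the dimensions, seed, and amplitude range are arbitrary assumptions.

```python
import numpy as np


def omp(A, y, k):
    """Orthogonal matching pursuit: greedily recover a k-sparse x from y = A @ x."""
    r, support = y.copy(), []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ r))))  # most correlated atom
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ coef                     # update residual
    x[support] = coef
    return x


rng = np.random.default_rng(4)
n, m, k = 256, 128, 5                          # signal length, measurements, sparsity
x_true = np.zeros(n)
idx = rng.choice(n, k, replace=False)
x_true[idx] = rng.choice([-1.0, 1.0], k) * rng.uniform(1.0, 2.0, k)
A = rng.normal(0.0, 1.0 / np.sqrt(m), (m, n))  # random Gaussian sensing matrix
x_hat = omp(A, A @ x_true, k)
print(np.max(np.abs(x_hat - x_true)))          # tiny when the support is found exactly
```

With m well above the sparsity level, the noiseless recovery is essentially exact even though only half the Nyquist-rate samples are taken.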