Post-Reconstruction Deconvolution of PET Images by Total Generalized Variation Regularization
Improving the quality of positron emission tomography (PET) images, which
suffer from low resolution and high noise levels, is a challenging task in
nuclear medicine and radiotherapy. This work proposes a restoration method
applied after tomographic reconstruction of the images, targeting clinical
situations where raw data are often not accessible. Based on inverse problem
methods, our contribution introduces the recently developed total generalized
variation (TGV) norm to regularize PET image deconvolution. Moreover, we
stabilize this procedure with additional image constraints such as positivity
and photometry invariance. A criterion for automatically updating and
adjusting the regularization parameter in the case of Poisson noise is also presented.
Experiments are conducted on both synthetic data and real patient images.
Comment: First published in the Proceedings of the 23rd European Signal
Processing Conference (EUSIPCO-2015) in 2015, published by EURASIP.
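The positivity and photometry-invariance constraints mentioned in the abstract can be illustrated with a minimal sketch. The helper below is hypothetical, not taken from the paper: it clips negative intensities and rescales so that the total photon count is preserved. Note that this clip-then-rescale heuristic is not the exact projection onto the intersection of the two constraint sets (that would need an iterative scheme such as Dykstra's algorithm).

```python
def project_constraints(image, total_flux):
    """Enforce positivity, then rescale to preserve total photometry.

    A simple heuristic sketch: clip negative values, then rescale so
    the image sums to `total_flux`. Hypothetical helper illustrating
    the constraints, not the paper's actual stabilization scheme.
    """
    clipped = [max(v, 0.0) for v in image]
    s = sum(clipped)
    if s == 0.0:
        return clipped
    scale = total_flux / s
    return [v * scale for v in clipped]

# Example: a deconvolution iterate with a negative overshoot is mapped
# back to a non-negative image with the prescribed total intensity.
estimate = project_constraints([2.0, -1.0, 3.0], total_flux=4.0)
```

After the projection, every pixel is non-negative and the image again sums to the prescribed flux, which is what keeps an unconstrained deconvolution iteration from drifting.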
A proximal iteration for deconvolving Poisson noisy images using sparse representations
We propose an image deconvolution algorithm when the data is contaminated by
Poisson noise. The image to restore is assumed to be sparsely represented in a
dictionary of waveforms such as the wavelet or curvelet transforms. Our key
contributions are: First, we handle the Poisson noise properly by using the
Anscombe variance-stabilizing transform, leading to a non-linear
degradation equation with additive Gaussian noise. Second, the deconvolution
problem is formulated as the minimization of a convex functional with a
data-fidelity term reflecting the noise properties and a non-smooth
sparsity-promoting penalty over the image representation coefficients (e.g.
the ℓ1-norm). Third, a fast iterative backward-forward splitting algorithm is
proposed to solve the minimization problem. We derive existence and uniqueness
conditions of the solution, and establish convergence of the iterative
algorithm. Finally, a GCV-based model selection procedure is proposed to
objectively select the regularization parameter. Experimental results are
carried out to show the striking benefits gained from taking into account the
Poisson statistics of the noise. These results also suggest that using
sparse-domain regularization may be tractable in many deconvolution
applications with Poisson noise, such as astronomy and microscopy.
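Two of the ingredients named in this abstract are simple enough to sketch directly: the Anscombe variance-stabilizing transform, and soft-thresholding, which is the proximal operator of the ℓ1 penalty applied to the representation coefficients in the backward (proximal) step of a splitting iteration. This is a minimal illustration of those pieces, not the authors' full algorithm (which also includes the degradation model, the GCV parameter selection, and the convergence analysis).

```python
import math

def anscombe(x):
    """Anscombe variance-stabilizing transform: 2*sqrt(x + 3/8).

    Maps Poisson-distributed counts to values with approximately unit
    variance and near-Gaussian statistics (for moderate counts), so
    Gaussian-noise deconvolution machinery can be applied afterwards.
    """
    return 2.0 * math.sqrt(x + 0.375)

def inverse_anscombe(y):
    """Simple algebraic inverse of the Anscombe transform.

    (Unbiased inverses exist but are more involved; this is the
    textbook algebraic inversion.)
    """
    return (y / 2.0) ** 2 - 0.375

def soft_threshold(coef, t):
    """Soft-thresholding with threshold t >= 0.

    The proximal operator of the l1 penalty: shrinks each coefficient
    toward zero by t and sets small coefficients exactly to zero,
    which is what promotes sparsity in the dictionary domain.
    """
    return math.copysign(max(abs(coef) - t, 0.0), coef)
```

In a splitting iteration, the forward step takes a gradient step on the (stabilized) data-fidelity term and the backward step applies `soft_threshold` coefficient-wise.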
Photon-Efficient Computational 3D and Reflectivity Imaging with Single-Photon Detectors
Capturing depth and reflectivity images at low light levels from active
illumination of a scene has wide-ranging applications. Conventionally, even
with single-photon detectors, hundreds of photon detections are needed at each
pixel to mitigate Poisson noise. We develop a robust method for estimating
depth and reflectivity using on the order of 1 detected photon per pixel
averaged over the scene. Our computational imager combines physically accurate
single-photon counting statistics with exploitation of the spatial correlations
present in real-world reflectivity and 3D structure. Experiments conducted in
the presence of strong background light demonstrate that our computational
imager is able to accurately recover scene depth and reflectivity, while
traditional maximum-likelihood based imaging methods lead to estimates that are
highly noisy. Our framework increases photon efficiency 100-fold over
traditional processing and also modestly improves upon first-photon imaging
under a total acquisition-time constraint in raster-scanned operation. Thus our
new imager will be useful for rapid, low-power, and noise-tolerant active
optical imaging, and its fixed dwell time will facilitate parallelization
through use of a detector array.
Comment: 11 pages, 8 figures.
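The time-of-flight geometry and the role of spatial correlations can be illustrated with a small sketch. Both helpers are hypothetical stand-ins, not the authors' imager: the first converts a photon's round-trip arrival time into a pixelwise depth estimate, and the second applies a 3x3 median filter as a crude substitute for the spatial-correlation priors that suppress isolated noisy estimates.

```python
import statistics

C = 299_792_458.0  # speed of light in m/s

def depth_from_time(t_seconds):
    """Pixelwise time-of-flight depth estimate.

    The photon travels to the scene and back, so depth = c * t / 2.
    With ~1 detected photon per pixel, this raw estimate is very noisy.
    """
    return C * t_seconds / 2.0

def median_filter3(img):
    """3x3 median filter with clamped borders.

    A crude stand-in for the spatial regularization the computational
    imager exploits: neighboring pixels in real scenes have correlated
    depths, so an isolated wild estimate is replaced by a value
    consistent with its neighborhood.
    """
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            vals = []
            for di in (-1, 0, 1):
                for dj in (-1, 0, 1):
                    ii = min(max(i + di, 0), h - 1)
                    jj = min(max(j + dj, 0), w - 1)
                    vals.append(img[ii][jj])
            out[i][j] = statistics.median(vals)
    return out
```

For example, a depth map that is flat except for one pixel corrupted by a background-light detection comes out flat after the filter, mimicking (very loosely) how spatial priors reject background photons.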