Image denoising with multi-layer perceptrons, part 1: comparison with existing algorithms and with bounds
Image denoising can be described as the problem of mapping from a noisy image
to a noise-free image. The best currently available denoising methods
approximate this mapping with cleverly engineered algorithms. In this work we
attempt to learn this mapping directly with plain multi-layer perceptrons (MLPs)
applied to image patches. We will show that by training on large image
databases we are able to outperform the current state-of-the-art image
denoising methods. In addition, our method achieves results that are superior
to one type of theoretical bound and goes a long way toward closing the gap
with a second type of theoretical bound. Our approach is easily adapted to less
extensively studied types of noise, such as mixed Poisson-Gaussian noise, JPEG
artifacts, salt-and-pepper noise and noise resembling stripes, for which we
achieve excellent results as well. We will show that combining a block-matching
procedure with MLPs can further improve the results on certain images. In a
second paper, we detail the training trade-offs and the inner mechanisms of our
MLPs.
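The patch-based MLP denoising pipeline described in this abstract can be sketched as follows. The patch size, stride, tanh activations, and the random (untrained) weights here are illustrative assumptions, not the paper's trained configuration; overlapping patch predictions are simply averaged:

```python
import numpy as np

def denoise_with_mlp(noisy, weights, biases, patch=17, stride=3):
    """Slide a window over the image, map each noisy patch through an
    MLP, and average the overlapping patch predictions (a sketch with
    hypothetical, untrained parameters)."""
    H, W = noisy.shape
    out = np.zeros_like(noisy, dtype=float)
    norm = np.zeros_like(noisy, dtype=float)
    for i in range(0, H - patch + 1, stride):
        for j in range(0, W - patch + 1, stride):
            x = noisy[i:i + patch, j:j + patch].ravel()
            for Wk, bk in zip(weights[:-1], biases[:-1]):
                x = np.tanh(Wk @ x + bk)           # hidden layers
            x = weights[-1] @ x + biases[-1]        # linear output layer
            out[i:i + patch, j:j + patch] += x.reshape(patch, patch)
            norm[i:i + patch, j:j + patch] += 1.0
    return out / np.maximum(norm, 1.0)

# Toy example: one hidden layer with random (untrained) parameters.
rng = np.random.default_rng(0)
p, h = 17, 64
weights = [rng.normal(0, 0.05, (h, p * p)), rng.normal(0, 0.05, (p * p, h))]
biases = [np.zeros(h), np.zeros(p * p)]
img = rng.normal(0.5, 0.1, (64, 64))
den = denoise_with_mlp(img, weights, biases)
```

In practice the weights would come from training on (noisy, clean) patch pairs drawn from a large image database, as the abstract describes.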
Image Denoising in Mixed Poisson-Gaussian Noise
We propose a general methodology (PURE-LET) to design and optimize a wide class of transform-domain thresholding algorithms for denoising images corrupted by mixed Poisson-Gaussian noise. We express the denoising process as a linear expansion of thresholds (LET) that we optimize by relying on a purely data-adaptive unbiased estimate of the mean-squared error (MSE), derived in a non-Bayesian framework (PURE: Poisson-Gaussian unbiased risk estimate). We provide a practical approximation of this theoretical MSE estimate for the tractable optimization of arbitrary transform-domain thresholding. We then propose a pointwise estimator for undecimated filterbank transforms, which consists of subband-adaptive thresholding functions with signal-dependent thresholds that are globally optimized in the image domain. We finally demonstrate the potential of the proposed approach through extensive comparisons with state-of-the-art techniques that are specifically tailored to the estimation of Poisson intensities. We also present denoising results obtained on real images of low-count fluorescence microscopy.
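As a concrete reference for the noise model these methods target, here is a minimal simulator of mixed Poisson-Gaussian corruption under the common parameterization z = α·p + n with p ~ Poisson(λ) and n ~ N(0, σ²); the parameter values are illustrative assumptions:

```python
import numpy as np

def add_poisson_gaussian(lam, alpha=1.0, sigma=0.1, rng=None):
    """Simulate z = alpha * p + n with p ~ Poisson(lam), n ~ N(0, sigma^2).

    E[z] = alpha * lam and Var[z] = alpha^2 * lam + sigma^2, so the
    noise variance depends on the underlying intensity lam.
    """
    rng = np.random.default_rng() if rng is None else rng
    lam = np.asarray(lam, dtype=float)
    p = rng.poisson(lam)                       # signal-dependent Poisson part
    n = rng.normal(0.0, sigma, size=lam.shape)  # additive Gaussian part
    return alpha * p + n

# Example: corrupt a constant-intensity image (lam = 20 everywhere).
rng = np.random.default_rng(1)
z = add_poisson_gaussian(np.full((256, 256), 20.0), alpha=0.5, sigma=0.2, rng=rng)
```

The signal-dependent variance α²λ + σ² is exactly what PURE-style risk estimates and variance-stabilizing transforms are designed to handle.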
Optimally Stabilized PET Image Denoising Using Trilateral Filtering
Low resolution and the signal-dependent noise distribution in positron emission
tomography (PET) images make denoising an inevitable step prior to
qualitative and quantitative image analysis tasks. Conventional PET denoising
methods either over-smooth small-sized structures due to resolution limitation
or make incorrect assumptions about the noise characteristics. Therefore,
clinically important quantitative information may be corrupted. To address
these challenges, we introduced a novel approach to remove signal-dependent
noise in PET images, where the noise distribution was modeled as mixed
Poisson-Gaussian. The generalized Anscombe transformation (GAT) was then used
to stabilize the signal-dependent variance of the PET noise. Beyond noise
stabilization, it is also desirable for the noise-removal filter to preserve
the boundaries of the structures while smoothing the noisy regions. Indeed, it
is important to avoid significant loss of quantitative information such as
standard uptake value (SUV)-based metrics as well as metabolic lesion volume.
To satisfy all these properties, we extended the bilateral filtering method
into a trilateral filter through multiscaling and an optimal Gaussianization
process.
The proposed method was tested on more than 50 PET-CT images from patients
with different cancers and achieved superior performance compared to widely
used denoising techniques in the literature.
Comment: 8 pages, 3 figures; to appear in Lecture Notes in Computer Science
(MICCAI 2014).
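The generalized Anscombe transformation referenced in this abstract has a standard closed form. The sketch below assumes the usual parameterization z = α·p + n with p ~ Poisson and n ~ N(μ, σ²); it is an illustrative implementation, not the authors' exact code:

```python
import numpy as np

def gat(z, alpha=1.0, sigma=0.0, mu=0.0):
    """Generalized Anscombe transform for z = alpha*p + n,
    p ~ Poisson, n ~ N(mu, sigma^2). Maps the signal-dependent
    variance to approximately unit variance."""
    arg = alpha * z + 0.375 * alpha ** 2 + sigma ** 2 - alpha * mu
    # Clip to zero before the square root to guard against negative
    # arguments produced by strong Gaussian noise.
    return (2.0 / alpha) * np.sqrt(np.maximum(arg, 0.0))

# Sanity check on pure Poisson data: with alpha=1, sigma=0, mu=0 the
# GAT reduces to the classical Anscombe transform 2*sqrt(z + 3/8),
# whose output variance is close to 1 for moderate intensities.
rng = np.random.default_rng(2)
z = rng.poisson(20.0, size=200_000)
t = gat(z)
```

After filtering in the stabilized domain, an (approximately) unbiased inverse transform is applied to return to the intensity domain.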