Multiscale hierarchical decomposition methods for images corrupted by multiplicative noise
Recovering images corrupted by multiplicative noise is a well-known and
challenging task. Motivated by the success of multiscale hierarchical
decomposition methods (MHDM) in image processing, we adapt a variety of both
classical and new multiplicative noise removal models to the MHDM form. On the
basis of previous work, we further present a tight and a refined version of the
corresponding multiplicative MHDM. We discuss existence and uniqueness of
solutions for the proposed models, and additionally, provide convergence
properties. Moreover, we present a discrepancy principle stopping criterion
which prevents recovering excess noise in the multiscale reconstruction.
Through comprehensive numerical experiments and comparisons, we qualitatively
and quantitatively evaluate the validity of all proposed models for denoising
and deblurring images degraded by multiplicative noise. By construction, these
multiplicative multiscale hierarchical decomposition methods have the added
benefit of recovering many scales of an image, which can provide features of
interest beyond image denoising.
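The multiscale idea above can be sketched in a few lines. A minimal illustration, assuming a log-domain transform to turn the multiplicative model f = u·n into an additive one, and a Gaussian filter as a stand-in for the variational minimization performed at each scale (the paper's models use TV-type regularizers instead):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def mhdm_log_domain(f, scales):
    """Hierarchical decomposition of f = u * n in the log domain.

    Hypothetical sketch: a Gaussian filter stands in for the variational
    minimization that extracts the cartoon part at each scale.
    """
    g = np.log(np.clip(f, 1e-6, None))       # multiplicative -> additive
    parts, residual = [], g
    for s in scales:                          # coarse-to-fine scale schedule
        u = gaussian_filter(residual, sigma=s)
        parts.append(u)
        residual = residual - u               # pass the residual down a scale
    return np.exp(sum(parts)), parts

# toy example: piecewise-constant image under gamma-distributed speckle
rng = np.random.default_rng(0)
clean = np.ones((64, 64))
clean[16:48, 16:48] = 2.0
noisy = clean * rng.gamma(shape=25.0, scale=1.0 / 25.0, size=clean.shape)
recon, parts = mhdm_log_domain(noisy, scales=[8, 4, 2])
```

Summing more and more scales adds back finer features; a discrepancy-principle stopping rule, as in the abstract, decides when adding another scale would start reintroducing noise.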
Astrophysically robust systematics removal using variational inference: application to the first month of Kepler data
Space-based transit search missions such as Kepler are collecting large
numbers of stellar light curves of unprecedented photometric precision and time
coverage. However, before this scientific goldmine can be exploited fully, the
data must be cleaned of instrumental artefacts. We present a new method to
correct common-mode systematics in large ensembles of very high precision light
curves. It is based on a Bayesian linear basis model and uses shrinkage priors
for robustness, variational inference for speed, and a de-noising step based on
empirical mode decomposition to prevent the introduction of spurious noise into
the corrected light curves. After demonstrating the performance of our method
on a synthetic dataset, we apply it to the first month of Kepler data. We
compare the results, which are publicly available, to the output of the Kepler
pipeline's pre-search data conditioning, and show that the two generally give
similar results, but the light curves corrected using our approach have lower
scatter, on average, on both long and short timescales. We finish by discussing
some limitations of our method and outlining some avenues for further
development. The trend-corrected data produced by our approach are publicly
available.

Comment: 15 pages, 13 figures, accepted for publication in MNRAS
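The correction step can be caricatured with standard linear algebra. A sketch, assuming a PCA-derived temporal basis and ridge regression as crude surrogates for the paper's Bayesian basis model with shrinkage priors and variational inference:

```python
import numpy as np

def correct_common_mode(flux, n_basis=2, lam=1e-2):
    """Remove common-mode systematics from an ensemble of light curves.

    Hypothetical sketch: leading principal components of the ensemble supply
    the linear basis, and the ridge penalty `lam` stands in for the paper's
    shrinkage priors.  flux: (n_stars, n_times) array.
    """
    resid = flux - np.median(flux, axis=1, keepdims=True)
    # leading right-singular vectors = shared temporal trends
    _, _, vt = np.linalg.svd(resid, full_matrices=False)
    B = vt[:n_basis].T                         # (n_times, n_basis)
    # ridge fit per star: W = (B^T B + lam I)^-1 B^T y
    A = B.T @ B + lam * np.eye(n_basis)
    W = np.linalg.solve(A, B.T @ resid.T)      # (n_basis, n_stars)
    return flux - (B @ W).T

# toy ensemble: flat light curves plus a shared instrumental trend
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)
trend = np.sin(2 * np.pi * 3 * t)
flux = (1.0 + rng.normal(0.0, 0.01, (50, 200))
        + rng.uniform(0.5, 1.5, (50, 1)) * trend)
corrected = correct_common_mode(flux)
```

Because only trends shared across the ensemble enter the basis, variability unique to one star is largely preserved, which is the same motivation behind the shrinkage priors in the abstract.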
Learning a Dilated Residual Network for SAR Image Despeckling
In this paper, to break the limit of the traditional linear models for
synthetic aperture radar (SAR) image despeckling, we propose a novel deep
learning approach by learning a non-linear end-to-end mapping between the noisy
and clean SAR images with a dilated residual network (SAR-DRN). SAR-DRN is
based on dilated convolutions, which can both enlarge the receptive field and
maintain the filter size and layer depth with a lightweight structure. In
addition, skip connections and a residual learning strategy are added to the
despeckling model to preserve image details and alleviate the vanishing
gradient problem. The proposed method shows superior performance over
state-of-the-art despeckling methods in both quantitative and visual
assessments, especially for strong speckle noise.

Comment: 18 pages, 13 figures, 7 tables
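The receptive-field benefit of dilation is easy to quantify. A small sketch, assuming stride-1 3×3 convolutions and a hypothetical 1-2-3-4-3-2-1 dilation schedule (the abstract does not give the exact schedule):

```python
def receptive_field(dilations, kernel=3):
    """Receptive field of stacked stride-1 dilated convolutions.

    Each layer with dilation d grows the receptive field by (kernel - 1) * d.
    """
    rf = 1
    for d in dilations:
        rf += (kernel - 1) * d
    return rf

# hypothetical SAR-DRN-style schedule vs. plain convolutions, 7 layers each
dilated = receptive_field([1, 2, 3, 4, 3, 2, 1])   # -> 33
plain = receptive_field([1] * 7)                   # -> 15
```

The dilated stack more than doubles the receptive field with the same filter size, layer depth, and parameter count, which is exactly the "enlarge the receptive field while staying lightweight" trade-off the abstract describes.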
Efficient parallelization on GPU of an image smoothing method based on a variational model
Medical imaging is fundamental for improvements in diagnostic accuracy. However, noise frequently corrupts the acquired images, which can lead to erroneous diagnoses. Fortunately, image preprocessing algorithms can enhance corrupted images, particularly through noise smoothing and removal. In the medical field, time is always a critical factor, so implementations need to be fast and, if possible, run in real time. This study presents and discusses a highly efficient implementation of an image noise smoothing algorithm based on general-purpose GPU (GPGPU) computing techniques. These techniques enable the quick and efficient smoothing of images corrupted by noise, even on large data sets. This is particularly relevant since GPU cards are becoming more affordable, powerful and common in medical environments.
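Why variational smoothing parallelizes so well can be shown with a CPU sketch. A minimal illustration, assuming a simple quadratic (H1) energy rather than the paper's actual model; the point is that each pixel update depends only on its four neighbours, which maps directly onto one-thread-per-pixel GPU kernels:

```python
import numpy as np

def smooth_step(u, f, lam=0.1, tau=0.2):
    """One explicit gradient-descent step on the hypothetical energy
    E(u) = 0.5*|u - f|^2 + 0.5*lam*|grad u|^2.

    The update at each pixel reads only its 4 neighbours (the Laplacian
    stencil), so all pixels can be updated independently in parallel.
    """
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)
    return u + tau * (lam * lap - (u - f))

rng = np.random.default_rng(2)
f = np.zeros((32, 32))
f[8:24, 8:24] = 1.0
noisy = f + rng.normal(0.0, 0.3, f.shape)
u = noisy.copy()
for _ in range(50):          # fixed iteration count for the sketch
    u = smooth_step(u, noisy)
```

On a GPU, the `smooth_step` stencil becomes one kernel launch per iteration, which is where the large speedups reported for GPGPU implementations come from.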
Nonlocal Myriad Filters for Cauchy Noise Removal
The contribution of this paper is two-fold. First, we introduce a generalized
myriad filter, which is a method to compute the joint maximum likelihood
estimator of the location and the scale parameter of the Cauchy distribution.
Estimating only the location parameter is known as the myriad filter. We propose an
efficient algorithm to compute the generalized myriad filter and prove its
convergence. Special cases of this algorithm yield the classical myriad
filter and, respectively, an algorithm for estimating only the scale parameter.
Based on an asymptotic analysis, we develop a second, even faster generalized
myriad filtering technique.
Second, we use our new approaches within a nonlocal, fully unsupervised
method to denoise images corrupted by Cauchy noise. Special attention is paid
to the determination of similar patches in noisy images. Numerical examples
demonstrate the excellent performance of our algorithms, which moreover have
the advantage of being robust with respect to the parameter choice.
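The joint estimator described above can be illustrated with a generic optimizer. A sketch, assuming SciPy's Nelder-Mead search in place of the paper's dedicated fixed-point algorithm with its convergence proof; only the Cauchy negative log-likelihood itself is taken from the problem statement:

```python
import numpy as np
from scipy.optimize import minimize

def generalized_myriad(x):
    """Joint ML estimate of the Cauchy location mu and scale gamma.

    Hypothetical sketch: a generic Nelder-Mead search replaces the paper's
    fixed-point algorithm; the objective is the exact Cauchy negative
    log-likelihood (up to additive constants).
    """
    def nll(p):
        mu, log_g = p
        g = np.exp(log_g)        # optimize log(gamma) so that gamma > 0
        return np.sum(np.log(g ** 2 + (x - mu) ** 2)) - x.size * np.log(g)

    iqr = np.percentile(x, 75) - np.percentile(x, 25)
    x0 = np.array([np.median(x), np.log(iqr / 2.0)])   # IQR of Cauchy = 2*gamma
    res = minimize(nll, x0, method="Nelder-Mead")
    mu_hat, log_g_hat = res.x
    return mu_hat, np.exp(log_g_hat)

rng = np.random.default_rng(3)
samples = 2.0 + 0.5 * rng.standard_cauchy(10_000)
mu_hat, gamma_hat = generalized_myriad(samples)
```

Fixing gamma and minimizing over mu alone recovers the classical myriad filter as a special case, mirroring the relationship stated in the abstract.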