    The curvelet transform for image denoising

    We describe approximate digital implementations of two new mathematical transforms, namely, the ridgelet transform and the curvelet transform. Our implementations offer exact reconstruction, stability against perturbations, ease of implementation, and low computational complexity. A central tool is Fourier-domain computation of an approximate digital Radon transform. We introduce a very simple interpolation in Fourier space which takes Cartesian samples and yields samples on a rectopolar grid, which is a pseudo-polar sampling set based on a concentric squares geometry. Despite the crudeness of our interpolation, the visual performance is surprisingly good. Our ridgelet transform applies to the Radon transform a special overcomplete wavelet pyramid whose wavelets have compact support in the frequency domain. Our curvelet transform uses our ridgelet transform as a component step, and implements curvelet subbands using a filter bank of à trous wavelet filters. Our philosophy throughout is that transforms should be overcomplete, rather than critically sampled. We apply these digital transforms to the denoising of some standard images embedded in white noise. In the tests reported here, simple thresholding of the curvelet coefficients is very competitive with "state of the art" techniques based on wavelets, including thresholding of decimated or undecimated wavelet transforms and also including tree-based Bayesian posterior mean methods. Moreover, the curvelet reconstructions exhibit higher perceptual quality than wavelet-based reconstructions, offering visually sharper images and, in particular, higher quality recovery of edges and of faint linear and curvilinear features. Existing theory for curvelet and ridgelet transforms suggests that these new approaches can outperform wavelet methods in certain image reconstruction problems. The empirical results reported here are in encouraging agreement with this theory.
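
    As a rough illustration of the thresholding step, the sketch below implements only the à trous (B3-spline) filter bank mentioned above and hard-thresholds the resulting detail planes; the ridgelet/curvelet stages and the pseudo-polar Radon transform are omitted, and the MAD noise estimate and the k·sigma rule are common defaults rather than the authors' exact choices. It assumes NumPy and SciPy.

```python
import numpy as np
from scipy.ndimage import convolve1d

def a_trous_decompose(image, n_scales=4):
    """Undecimated 'a trous' decomposition with the B3-spline kernel.

    Returns the detail planes (finest first) plus the coarse residual.
    """
    h = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0
    smooth = np.asarray(image, dtype=float)
    planes = []
    for j in range(n_scales):
        # Dilate the kernel by inserting 2**j - 1 zeros between taps.
        kernel = np.zeros((len(h) - 1) * 2**j + 1)
        kernel[::2**j] = h
        next_smooth = convolve1d(convolve1d(smooth, kernel, axis=0, mode='reflect'),
                                 kernel, axis=1, mode='reflect')
        planes.append(smooth - next_smooth)   # detail plane at scale j
        smooth = next_smooth
    planes.append(smooth)                     # coarse residual
    return planes

def denoise(image, n_scales=4, k=3.0, sigma=None):
    """Hard-threshold the detail planes at k*sigma and sum everything back."""
    planes = a_trous_decompose(image, n_scales)
    if sigma is None:
        # Robust noise estimate from the finest detail plane (MAD / 0.6745).
        sigma = np.median(np.abs(planes[0])) / 0.6745
    out = planes[-1].copy()
    for p in planes[:-1]:
        out += np.where(np.abs(p) > k * sigma, p, 0.0)
    return out
```

    Because the à trous transform is undecimated (overcomplete), reconstruction is simply the sum of the detail planes and the coarse residual, which is what makes per-plane thresholding so direct.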

    On the energy leakage of discrete wavelet transform

    Energy leakage is an inherent deficiency of the discrete wavelet transform (DWT) that is often ignored by researchers and practitioners. In this paper, a systematic investigation into energy leakage is reported. The DWT is briefly introduced first, then the energy leakage phenomenon is described using a numerical example as an illustration, and its effect on the DWT results is discussed. Focusing on the Daubechies wavelet functions, the band overlap between the quadrature mirror analysis filters was studied, and the results reveal that there is an unavoidable trade-off between the degree of band overlap and the time resolution of the DWT. The dependency of the energy leakage on the wavelet function order was studied using a criterion defined to evaluate the severity of the leakage. In addition, a method based on a resampling technique was proposed to alleviate the effects of the energy leakage. The effectiveness of the proposed method has been validated by numerical simulation and experimental studies.
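
    A minimal numerical illustration of the leakage effect, assuming the PyWavelets package: a pure tone whose frequency lies just inside a dyadic band edge is decomposed with Daubechies wavelets of increasing order, and the fraction of signal energy falling in each subband is printed. The sampling rate, tone frequency, and wavelet orders are illustrative choices, not the paper's numerical example.

```python
import numpy as np
import pywt

fs = 1024.0
t = np.arange(0, 2.0, 1.0 / fs)
# A 250 Hz tone sits just below the level-1/level-2 band edge at fs/4 = 256 Hz,
# so ideally its energy would fall entirely in the level-2 detail band.
x = np.sin(2 * np.pi * 250.0 * t)

for name in ('db2', 'db10', 'db20'):
    coeffs = pywt.wavedec(x, name, level=4)           # [cA4, cD4, cD3, cD2, cD1]
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    shares = energies / energies.sum()
    # With brick-wall filters one share would be ~1.0; the spread across
    # neighbouring bands is the energy leakage discussed in the paper.
    print(name, np.round(shares, 3))
```

    Higher-order Daubechies filters have sharper transition bands, so the leakage shrinks as the order grows, but the longer filters also degrade time resolution, which is the trade-off studied above.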

    Cosmological constraints from the capture of non-Gaussianity in Weak Lensing data

    Weak gravitational lensing has become a common tool to constrain the cosmological model. The majority of methods that derive constraints on cosmological parameters use second-order statistics of the cosmic shear. Despite their success, second-order statistics are not optimal, and degeneracies between some parameters remain. Tighter constraints can be obtained if second-order statistics are combined with a statistic that efficiently captures non-Gaussianity. In this paper, we search for such a statistical tool and show that there is additional information to be extracted from statistical analysis of the convergence maps beyond what can be obtained from statistical analysis of the shear field. For this purpose, we have carried out a large number of cosmological simulations along the σ8-Ωm degeneracy, and we have considered three statistics commonly used to characterize non-Gaussian features: skewness, kurtosis, and peak count. To investigate non-Gaussianity directly in the shear field, we have used the aperture-mass definition of these three statistics at different scales. The results have then been compared with those obtained with the same statistics estimated in the convergence maps at the same scales. First, we show that shear statistics give constraints similar to those given by convergence statistics when the same scale is considered. In addition, we find that the peak count statistic is the best at capturing non-Gaussianities in the weak lensing field and at breaking the σ8-Ωm degeneracy. We show that this statistical analysis should be conducted in the convergence maps: first, because there exist fast algorithms to compute the convergence map at different scales, and second, because it offers the opportunity to denoise the reconstructed convergence map, which improves the extraction of non-Gaussian features. Comment: Accepted for publication in MNRAS (11 pages, 5 figures, 9 tables).
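
    As a simplified, concrete example of the peak count statistic, the sketch below finds local maxima in a smoothed convergence map and histograms their heights in units of the map rms. It assumes NumPy/SciPy; the smoothing scale, the bins, the rms normalisation (instead of a proper noise model), and the Gaussian random field standing in for a simulated map are all placeholder choices, not taken from the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def peak_counts(kappa, smoothing_pix=2.0, bins=np.linspace(0.0, 6.0, 13)):
    """Histogram of peak heights in a convergence map, in units of the map rms."""
    smoothed = gaussian_filter(kappa, smoothing_pix)
    nu = smoothed / smoothed.std()              # crude signal-to-noise map
    # A peak is a pixel equal to the maximum of its 3x3 neighbourhood.
    is_peak = (nu == maximum_filter(nu, size=3))
    counts, _ = np.histogram(nu[is_peak], bins=bins)
    return counts

# Example on a Gaussian random field used as a stand-in for a simulated map:
rng = np.random.default_rng(0)
kappa = gaussian_filter(rng.normal(size=(512, 512)), 4.0)
print(peak_counts(kappa))
```

    Counting peaks as a function of their height singles out the rare high-convergence structures that carry much of the non-Gaussian information, which is why this statistic is effective at breaking the σ8-Ωm degeneracy.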

    SPECKLE NOISE REDUCTION USING ADAPTIVE MULTISCALE PRODUCTS THRESHOLDING

    Image denoising is an essential preprocessing step in image acquisition systems. For instance, in ultrasound (US) images, suppression of speckle noise while preserving edges is highly desirable. Thus, in this paper, denoising of speckle noise using a wavelet-based multiscale product thresholding approach is presented. The underlying principle of this technique is to apply the dyadic wavelet transform and form the multiscale products of the wavelet coefficients. An adaptive threshold is then calculated and applied to the multiscale products instead of to the wavelet coefficients directly. Thereafter, the performance of the proposed technique is compared with other denoising techniques such as the Lee filter, boxcar filter, linear minimum mean square error (LMMSE) filter, and median filter. The results show that the proposed technique gives better performance in terms of PSNR and ENL, with average gains of 1.22 and 1.8 times those of the noisy image, respectively, and better preserves image detail.
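
    A minimal sketch of the multiscale product idea, assuming PyWavelets: detail coefficients of an undecimated 2D wavelet transform are multiplied across two adjacent scales (edges persist across scales, noise does not), and coefficients are kept only where that product is significant. The db2 wavelet, the two-level depth, and the k·sigma rule on the product are placeholder choices, not the adaptive threshold derived in the paper, and the log transform often applied to multiplicative speckle is omitted.

```python
import numpy as np
import pywt

def multiscale_product_denoise(img, wavelet='db2', k=3.0):
    """Keep detail coefficients only where the cross-scale product is large."""
    img = np.asarray(img, dtype=float)
    # swt2 requires each image dimension to be a multiple of 2**level.
    coeffs = pywt.swt2(img, wavelet, level=2)     # [(cA, (cH, cV, cD)), ...]
    (a0, d0), (a1, d1) = coeffs
    out = []
    for a, own, other in ((a0, d0, d1), (a1, d1, d0)):
        kept = []
        for d, dn in zip(own, other):
            prod = d * dn                         # multiscale product
            thr = k * prod.std()                  # stand-in for the adaptive threshold
            kept.append(np.where(np.abs(prod) > thr, d, 0.0))
        out.append((a, tuple(kept)))
    return pywt.iswt2(out, wavelet)

# Example on a synthetic image with multiplicative (gamma) speckle:
rng = np.random.default_rng(0)
clean = np.full((256, 256), 0.2)
clean[96:160, 96:160] = 1.0
noisy = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)
denoised = multiscale_product_denoise(noisy)
```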