137 research outputs found

    Contourlet Domain Image Modeling and its Applications in Watermarking and Denoising

    Statistical image modeling in sparse domains has recently attracted a great deal of research interest. The contourlet transform, a two-dimensional transform with multiscale and multi-directional properties, is known to effectively capture the smooth contours and geometrical structures in images. The objective of this thesis is to study the statistical properties of the contourlet coefficients of images and develop statistically-based image denoising and watermarking schemes. Through an experimental investigation, it is first established that the distributions of the contourlet subband coefficients of natural images are significantly non-Gaussian with heavy tails and are best described by heavy-tailed statistical distributions, such as the alpha-stable family. It is shown that the univariate members of this family are capable of accurately fitting the marginal distributions of the empirical data and that the bivariate members can accurately characterize the inter-scale dependencies of the contourlet coefficients of an image. Based on the modeling results, a new method for image denoising in the contourlet domain is proposed. Bayesian maximum a posteriori and minimum mean absolute error estimators are developed to determine the noise-free contourlet coefficients of grayscale and color images. Extensive experiments are conducted using a wide variety of images from a number of databases to evaluate the performance of the proposed image denoising scheme and to compare it with that of existing schemes. It is shown that the proposed denoising scheme based on the alpha-stable distributions outperforms these other methods in terms of the peak signal-to-noise ratio and mean structural similarity index, as well as the visual quality of the denoised images. The alpha-stable model is also used to develop new multiplicative watermarking schemes for grayscale and color images. Closed-form expressions are derived for the log-likelihood-based multiplicative watermark detection algorithm for grayscale images using the univariate and bivariate Cauchy members of the alpha-stable family. A multiplicative multichannel watermark detector is also designed for color images using the multivariate Cauchy distribution. Simulation results demonstrate not only the effectiveness of the proposed image watermarking schemes in terms of the invisibility of the watermark, but also the superiority of the watermark detectors in providing detection rates higher than those of state-of-the-art schemes, even for watermarked images that have undergone various kinds of attacks.
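
    The thesis describes Bayesian MAP estimation of noise-free contourlet coefficients under heavy-tailed (alpha-stable, e.g. Cauchy) priors. A minimal sketch of that idea follows, assuming additive Gaussian noise, a univariate Cauchy prior, and a plain grid search for the per-coefficient MAP solution; the contourlet transform itself, the function names, and all parameter values are illustrative stand-ins rather than the thesis' implementation.

```python
# Illustrative sketch only: per-coefficient Bayesian MAP shrinkage under a Cauchy
# prior (a univariate alpha-stable member), assuming additive Gaussian noise.
# The contourlet transform is not implemented; `coeffs` stands in for one subband.
import numpy as np

def cauchy_map_shrink(coeffs, sigma_noise, gamma, half_width=5.0, grid_points=801):
    """MAP estimate of x from y = x + n, n ~ N(0, sigma_noise^2), x ~ Cauchy(0, gamma),
    found by a brute-force 1-D grid search around each observed coefficient."""
    y = coeffs.ravel()
    offsets = np.linspace(-half_width * sigma_noise, half_width * sigma_noise, grid_points)
    x_cand = y[:, None] + offsets[None, :]                       # candidate clean values
    # Negative log-posterior: Gaussian log-likelihood + Cauchy log-prior (constants dropped).
    nlp = (y[:, None] - x_cand) ** 2 / (2 * sigma_noise ** 2) \
          + np.log(1.0 + (x_cand / gamma) ** 2)
    x_map = x_cand[np.arange(y.size), nlp.argmin(axis=1)]
    return x_map.reshape(coeffs.shape)

# Usage on a synthetic heavy-tailed subband corrupted by Gaussian noise.
rng = np.random.default_rng(0)
clean = 0.5 * rng.standard_cauchy((64, 64))
noisy = clean + rng.normal(0.0, 1.0, clean.shape)
denoised = cauchy_map_shrink(noisy, sigma_noise=1.0, gamma=0.5)
```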

    Noise-Enhanced Information Systems

    Noise, traditionally defined as an unwanted signal or disturbance, has been shown to play an important constructive role in many information processing systems and algorithms. This noise enhancement has been observed and employed in many physical, biological, and engineered systems. Indeed, stochastic facilitation (SF) has been found, through both experimental verification and analytical model simulations, to be critical for certain biological information functions, such as the detection of weak, subthreshold stimuli or of suprathreshold signals. In this paper, we present a systematic noise-enhanced information processing framework to analyze and optimize the performance of engineered systems. System performance is evaluated not only in terms of signal-to-noise ratio but also in terms of other, more relevant metrics, such as the probability of error for signal detection or the mean square error for parameter estimation. As an important new instance of SF, we also discuss the constructive effect of noise in associative memory recall. Potential enhancement of image processing systems via the addition of noise is discussed, with important applications in biomedical image enhancement, image denoising, and classification.
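
    A toy numerical illustration of the stochastic facilitation idea (not taken from the paper) is sketched below: a hard-threshold sensor cannot register a subthreshold signal without noise, and an intermediate level of added Gaussian noise maximizes the gap between its firing rates with and without the signal. The threshold, signal amplitude, and noise levels are arbitrary assumptions.

```python
# Toy stochastic-facilitation demo (assumed parameters, not from the paper): a hard
# threshold sensor misses a subthreshold signal when no noise is added, and an
# intermediate noise level maximizes the gap between its firing rates with and
# without the signal, before too much noise washes the effect out again.
import numpy as np

rng = np.random.default_rng(1)
theta = 1.0          # sensor threshold (assumption)
signal = 0.3         # subthreshold signal amplitude (assumption)
n_trials = 200_000

for noise_std in (0.0, 0.3, 0.6, 1.0, 2.0, 3.0):
    noise = rng.normal(0.0, noise_std, n_trials) if noise_std > 0 else np.zeros(n_trials)
    rate_signal = np.mean(signal + noise > theta)    # firing rate, signal present
    rate_null = np.mean(noise > theta)               # firing rate, noise alone
    print(f"noise_std={noise_std:3.1f}  firing-rate gap={rate_signal - rate_null:.4f}")
```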

    Polyharmonic Smoothing Splines and the Multidimensional Wiener Filtering of Fractal-Like Signals

    Motivated by the fractal-like behavior of natural images, we develop a smoothing technique that uses a regularization functional which is a fractional iterate of the Laplacian. This type of functional was initially introduced by Duchon for the approximation of nonuniformly sampled, multidimensional data. He proved that the general solution is a smoothing spline that is represented by a linear combination of radial basis functions (RBFs). Unfortunately, this is tedious to implement for images because of the poor conditioning of RBFs and their lack of decay. Here, we present a much more efficient method for the special case of a uniform grid. The key idea is to express Duchon's solution in a fractional polyharmonic B-spline basis that spans the same space as the RBFs. This allows us to derive an algorithm where the smoothing is performed by filtering in the Fourier domain. Next, we prove that the above smoothing spline can be optimally tuned to provide the MMSE estimation of a fractional Brownian field corrupted by white noise. This is a strong result that not only yields the best linear filter (Wiener solution), but also the optimal interpolation space, which is not bandlimited. It also suggests a way of using the noisy data to identify the optimal parameters (order of the spline and smoothing strength), which yields a fully automatic smoothing procedure. We evaluate the performance of our algorithm by comparing it against an oracle Wiener filter, which requires knowledge of the true noiseless power spectrum of the signal. We find that our approach performs almost as well as the oracle solution over a wide range of conditions.
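
    The core computational idea above is Fourier-domain smoothing under a fractional-Laplacian penalty. The sketch below implements only the generic Tikhonov-style analogue, minimizing ||f - g||^2 + lam * ||(-Laplacian)^(gamma/2) f||^2 with the closed-form filter 1/(1 + lam * |omega|^(2*gamma)); the paper's exact polyharmonic B-spline filter and its automatic parameter identification are not reproduced, and all parameter values are assumptions.

```python
# Generic sketch, not the paper's exact polyharmonic B-spline filter: Fourier-domain
# smoothing that minimizes ||f - g||^2 + lam * ||(-Laplacian)^(gamma/2) f||^2,
# whose closed-form frequency response is 1 / (1 + lam * |omega|^(2*gamma)).
import numpy as np

def fractional_laplacian_smooth(img, lam=0.5, gamma=1.0):
    ny, nx = img.shape
    wy = 2 * np.pi * np.fft.fftfreq(ny)              # angular frequency samples per axis
    wx = 2 * np.pi * np.fft.fftfreq(nx)
    w2 = wy[:, None] ** 2 + wx[None, :] ** 2         # |omega|^2 on the DFT grid
    H = 1.0 / (1.0 + lam * w2 ** gamma)              # smoothing filter (acts as a low-pass)
    return np.real(np.fft.ifft2(np.fft.fft2(img) * H))

# Usage: smooth a white-noise image; lam and gamma are assumed values, whereas the
# paper identifies them automatically from the noisy data.
rng = np.random.default_rng(2)
noisy = rng.normal(size=(128, 128))
smoothed = fractional_laplacian_smooth(noisy, lam=0.5, gamma=1.0)
```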

    Image Restoration

    This book presents a sample of recent contributions from researchers around the world in the field of image restoration. The book consists of 15 chapters organized in three main sections (Theory, Applications, Interdisciplinarity). The topics cover different aspects of the theory of image restoration, but the book is also an occasion to highlight new research topics related to the emergence of original imaging devices. These devices give rise to challenging problems in image reconstruction/restoration that open the way to new fundamental scientific questions closely related to the world we interact with.

    Adaptive filtering techniques for acquisition noise and coding artifacts of digital pictures

    The quality of digital pictures is often degraded by various processes (e.g., acquisition, compression, filtering, transmission). In digital image/video processing systems, random noise appearing in images is mainly generated during the capturing process, while artifacts (or distortions) are generated by compression or filtering processes. This dissertation examines digital image/video quality degradation and proposes post-processing techniques for reducing coding artifacts and acquisition noise in images and videos. Three major issues associated with image/video degradation are addressed in this work. The first issue is the temporal fluctuation artifact in digitally compressed videos. In the state-of-the-art video coding standard, H.264/AVC, temporal fluctuations are noticeable between intra picture frames or between an intra picture frame and neighbouring inter picture frames. To resolve this problem, a novel robust statistical temporal filtering technique is proposed. It utilises a re-descending robust statistical model with an outlier rejection feature to reduce the temporal fluctuations while preserving picture details and motion sharpness. PSNR and sum of squared differences (SSD) results show the improvement of the proposed filter over other benchmark filters. Even for videos containing high motion, the proposed temporal filter shows good performance in fluctuation reduction and motion clarity preservation compared with other baseline temporal filters. The second issue concerns both the spatial and temporal artifacts (e.g., blocking, ringing, and temporal fluctuation artifacts) appearing in compressed video. To address this issue, a novel joint spatial and temporal filtering framework is constructed for artifact reduction. Both the spatial and the temporal filters employ a re-descending robust statistical model (RRSM) in the filtering processes. The robust statistical spatial filter (RSSF) reduces spatial blocking and ringing artifacts, whilst the robust statistical temporal filter (RSTF) suppresses the temporal fluctuations. Performance evaluations demonstrate that the proposed joint spatio-temporal filter is superior to the H.264 loop filter in terms of spatial and temporal artifact reduction and motion clarity preservation. The third issue is random noise, commonly modeled as mixed Gaussian and impulse noise (MGIN), which appears in the image/video acquisition process. An effective way to estimate MGIN is through a robust estimator, the median absolute deviation normalized (MADN). The MADN estimator is used to separate the MGIN model into its impulse and additive Gaussian noise portions. Based on this estimation, the proposed filtering process is composed of a modified median filter for impulse noise reduction and a DCT-based denoising filter for additive Gaussian noise reduction. However, this DCT-based denoising filter produces temporal fluctuations for videos. To solve this problem, a temporal filter is added to the filtering process, yielding another joint spatio-temporal filtering scheme that achieves the best visual quality of the denoised videos. Extensive experiments show that the proposed joint spatio-temporal filtering scheme outperforms other benchmark filters in noise and distortion suppression.
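
    Two of the building blocks mentioned above, the MADN scale estimate and a re-descending robust temporal filter, are sketched below under stated assumptions: the weight function (Tukey's biweight), the temporal window, and the use of frame differences for the noise scale are illustrative choices, not the dissertation's exact design.

```python
# Illustrative sketch of two ingredients named above: the MADN scale estimate and a
# re-descending robust temporal filter. The Tukey-biweight weight, window length,
# and use of frame differences for the noise scale are assumptions, not the
# dissertation's exact design.
import numpy as np

def madn(x):
    """Median absolute deviation, normalized to be consistent with a Gaussian sigma."""
    med = np.median(x)
    return np.median(np.abs(x - med)) / 0.6745

def robust_temporal_filter(frames, t, radius=2, c=4.685):
    """Filter frame t of a (T, H, W) sequence with re-descending weights on the
    difference between each neighbouring frame and the reference frame."""
    ref = frames[t].astype(float)
    num = np.zeros_like(ref)
    den = np.zeros_like(ref)
    scale = max(madn(np.diff(frames, axis=0)), 1e-6)      # robust noise scale
    for k in range(max(0, t - radius), min(frames.shape[0], t + radius + 1)):
        d = (frames[k].astype(float) - ref) / (c * scale)
        w = np.where(np.abs(d) < 1.0, (1.0 - d ** 2) ** 2, 0.0)   # outliers get zero weight
        num += w * frames[k]
        den += w
    return num / np.maximum(den, 1e-12)

# Usage on a synthetic static scene observed through noisy frames.
rng = np.random.default_rng(3)
scene = rng.uniform(0, 255, (48, 64))
frames = scene[None, :, :] + rng.normal(0, 10, (7, 48, 64))
filtered = robust_temporal_filter(frames, t=3)
```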

    Wavelet regression using a Lévy prior model

    This thesis is concerned with nonparametric regression and regularization. In particular, wavelet regression using a Lévy prior model is investigated. The use of this prior is motivated by statistical properties, such as heavy tails, common in many datasets of interest, such as financial time series. The Lévy process we propose captures the heavy tails of the wavelet coefficients of an unknown function. We study the Besov regularity of the wavelet coefficients and establish the connection between the parameters of the Lévy wavelet prior model and Besov spaces. First, we give a necessary and sufficient condition for the realizations of the prior model to fall into a certain class of Besov spaces. We show that the tempered stable distribution preserves its functional form across different time scales. We prove that this scaling behaviour can model the exponential-decay-across-scale property of the wavelet coefficients without imposing any specified structure on the coefficients' energy. We also introduce a Lévy wavelet mixture model to capture the sparseness of the wavelet coefficients and show that this sparse model exhibits a thresholding rule. We then study the Lévy tempered stable prior model under a Bayesian framework. For the specified prior, we give a closed form for the posterior Lévy measure of the wavelet coefficients and estimate the hyperparameters of the prior model both in a simulation study and for the S&P 500 time series. Finally, we focus on density estimation using a penalized likelihood approach. We study the wavelet Tsallis entropy and Fisher information and give closed-form expressions for these measures when the wavelet coefficients are driven by a tempered stable process. We then develop an entropic regularization based on the wavelet Tsallis entropy and show that the penalized maximum likelihood method improves the convergence of the estimates.
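
    The thesis' actual shrinkage rule follows from the Lévy mixture posterior, which is not reproduced here. As a loose, hedged stand-in, the sketch below performs ordinary wavelet regression by soft thresholding with the universal threshold (using PyWavelets); the wavelet choice, decomposition depth, and test signal are assumptions.

```python
# Loose stand-in for the Lévy-prior shrinkage rule: ordinary wavelet regression by
# soft thresholding with the universal threshold, using PyWavelets. The wavelet,
# decomposition depth, and test signal are assumptions.
import numpy as np
import pywt  # PyWavelets

def wavelet_shrink(y, wavelet="db4", level=5):
    coeffs = pywt.wavedec(y, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise scale from finest details
    thr = sigma * np.sqrt(2.0 * np.log(len(y)))          # universal threshold
    shrunk = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(shrunk, wavelet)[: len(y)]

# Usage on a noisy piecewise-constant signal.
rng = np.random.default_rng(4)
t = np.linspace(0.0, 1.0, 1024)
y = np.sign(np.sin(4 * np.pi * t)) + rng.normal(0.0, 0.3, t.size)
estimate = wavelet_shrink(y)
```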

    Improved time-frequency de-noising of acoustic signals for underwater detection system

    The capability to communicate and perform target localization efficiently in an underwater environment is important in many applications. Sound waves are more suitable than electromagnetic waves for underwater communication and target localization because the attenuation of electromagnetic waves in water is high. Sound waves are, however, subject to underwater acoustic noise (UWAN), which is either man-made or natural. Optimum signal detection in UWAN can be achieved with knowledge of the noise statistics. The assumption of additive white Gaussian noise (AWGN) allows the use of a linear correlation (LC) detector; however, the non-Gaussian nature of UWAN results in the poor performance of such a detector. This research presents an empirical model of the characteristics of UWAN in shallow waters. Data were measured in Tanjung Balau, Johor, Malaysia on 5 November 2013, and the analysis showed that the UWAN has a non-Gaussian distribution with characteristics similar to 1/f noise. A complete detection system based on the noise models is proposed, consisting of a broadband hydrophone, a time-frequency distribution, a de-noising method, and a detection stage. In this research, the S-transform and the wavelet transform were used to generate the time-frequency representation before soft thresholding with a modified universal threshold estimate was applied. A Gaussian noise injection detector (GNID) was used to overcome the non-Gaussianity of the UWAN, and its performance was compared with that of other nonlinear detectors, such as the locally optimal (LO) detector and the sign correlation (SC) detector, as well as the more conventional matched filter (MF) detector. The system was evaluated on two types of signals, namely fixed-frequency and linear frequency-modulated signals. For de-noising purposes, the S-transform outperformed the wavelet transform in terms of signal-to-noise ratio and root-mean-square error by 4 dB and 3 dB, respectively. The performance of the detectors was evaluated based on the energy-to-noise ratio (ENR) required to achieve a detection probability of 90% at a false alarm probability of 0.01. The ENRs of the GNID with S-transform de-noising, the LO detector, the SC detector, and the MF detector were 8.89 dB, 10.66 dB, 12.7 dB, and 12.5 dB, respectively, for the time-varying signal. Among the four detectors, the proposed GNID achieved the best performance, whereas the LC detector showed the weakest performance in the presence of UWAN.
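
    The contrast between the linear correlation detector and nonlinear alternatives in heavy-tailed noise can be reproduced with a small Monte Carlo experiment, sketched below. This is an illustrative simulation, not the thesis' measured-data study: Cauchy samples stand in for UWAN, the template, amplitude, and trial counts are assumptions, and only the LC and SC detectors are compared.

```python
# Illustrative Monte Carlo comparison (not the thesis' measured-data experiment):
# in heavy-tailed noise, here standard Cauchy samples standing in for UWAN, the
# linear correlation (LC) detector collapses while the sign correlation (SC)
# detector still detects a known fixed-frequency template.
import numpy as np

rng = np.random.default_rng(5)
n, trials = 256, 5000
s = np.cos(2 * np.pi * 0.1 * np.arange(n))            # known signal template
amp = 1.5                                              # assumed received amplitude

def detection_rate(stat_fn, pfa=0.01):
    null = np.array([stat_fn(rng.standard_cauchy(n)) for _ in range(trials)])
    alt = np.array([stat_fn(amp * s + rng.standard_cauchy(n)) for _ in range(trials)])
    thr = np.quantile(null, 1.0 - pfa)                 # threshold set for the target Pfa
    return np.mean(alt > thr)

pd_lc = detection_rate(lambda x: np.dot(x, s))          # linear correlation detector
pd_sc = detection_rate(lambda x: np.dot(np.sign(x), s)) # sign correlation detector
print(f"Pd at Pfa=0.01: LC={pd_lc:.2f}, SC={pd_sc:.2f}")
```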