
    Denoising techniques - a comparison

    Visual information transmitted in the form of digital images has become a major method of communication in the modern age, but the image obtained after transmission is often corrupted with noise. The received image needs processing before it can be used in applications. Image denoising involves the manipulation of the image data to produce a visually high-quality image. This thesis reviews existing denoising algorithms, such as the filtering approach, the wavelet-based approach, and the multifractal approach, and performs a comparative study of them. Different noise models, including additive and multiplicative types, are used: Gaussian noise, salt-and-pepper noise, speckle noise, and Brownian noise. Selection of the denoising algorithm is application dependent; hence, it is necessary to know the noise present in the image in order to select the appropriate denoising algorithm. The filtering approach has proved to be the best when the image is corrupted with salt-and-pepper noise. The wavelet-based approach finds applications in denoising images corrupted with Gaussian noise. In cases where the noise characteristics are complex, the multifractal approach can be used. A quantitative measure of comparison is provided by the signal-to-noise ratio of the image.
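
    As a concrete instance of the comparison this abstract describes, the sketch below applies a median filter (a standard filtering approach) to salt-and-pepper noise and scores the result by signal-to-noise ratio. The toy image, noise level, and helper names are illustrative, not taken from the thesis.

```python
# Median filtering of salt-and-pepper noise, scored by SNR in decibels.
import numpy as np
from scipy.ndimage import median_filter

def add_salt_and_pepper(image, amount=0.05, seed=None):
    """Corrupt a grayscale image in [0, 1] with salt-and-pepper noise."""
    rng = np.random.default_rng(seed)
    noisy = image.copy()
    mask = rng.random(image.shape)
    noisy[mask < amount / 2] = 0.0          # pepper pixels
    noisy[mask > 1 - amount / 2] = 1.0      # salt pixels
    return noisy

def snr_db(clean, estimate):
    """Signal-to-noise ratio of the estimate, in decibels."""
    return 10 * np.log10(np.sum(clean**2) / np.sum((clean - estimate)**2))

clean = np.tile(np.linspace(0, 1, 64), (64, 1))   # toy gradient test image
noisy = add_salt_and_pepper(clean, amount=0.1, seed=0)
denoised = median_filter(noisy, size=3)
print(f"noisy SNR: {snr_db(clean, noisy):.1f} dB, "
      f"denoised SNR: {snr_db(clean, denoised):.1f} dB")
```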

    Polyharmonic Smoothing Splines and the Multidimensional Wiener Filtering of Fractal-Like Signals

    Motivated by the fractal-like behavior of natural images, we develop a smoothing technique that uses a regularization functional which is a fractional iterate of the Laplacian. This type of functional was initially introduced by Duchon for the approximation of nonuniformly sampled, multidimensional data. He proved that the general solution is a smoothing spline that is represented by a linear combination of radial basis functions (RBFs). Unfortunately, this is tedious to implement for images because of the poor conditioning of RBFs and their lack of decay. Here, we present a much more efficient method for the special case of a uniform grid. The key idea is to express Duchon's solution in a fractional polyharmonic B-spline basis that spans the same space as the RBFs. This allows us to derive an algorithm where the smoothing is performed by filtering in the Fourier domain. Next, we prove that the above smoothing spline can be optimally tuned to provide the MMSE estimation of a fractional Brownian field corrupted by white noise. This is a strong result that not only yields the best linear filter (Wiener solution), but also the optimal interpolation space, which is not bandlimited. It also suggests a way of using the noisy data to identify the optimal parameters (order of the spline and smoothing strength), which yields a fully automatic smoothing procedure. We evaluate the performance of our algorithm by comparing it against an oracle Wiener filter, which requires knowledge of the true noiseless power spectrum of the signal. We find that our approach performs almost as well as the oracle solution over a wide range of conditions.
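
    The computational core described above, smoothing as a single multiplication in the Fourier domain, can be sketched as follows. The filter response 1/(1 + λ|ω|^(2γ)) is the generic form of a fractional-Laplacian regularized smoother; the paper's exact polyharmonic B-spline discretization differs, so the function name and the parameters `gamma` (spline order) and `lam` (smoothing strength) are our illustrative choices, not the authors' implementation.

```python
# Generic Fourier-domain smoothing with a fractional-Laplacian penalty.
import numpy as np

def fourier_domain_smooth(img, gamma=1.0, lam=0.1):
    """Smooth a 2-D array by filtering with 1 / (1 + lam * |w|^(2*gamma))."""
    wx = np.fft.fftfreq(img.shape[0]) * 2 * np.pi   # radial frequencies (rows)
    wy = np.fft.fftfreq(img.shape[1]) * 2 * np.pi   # radial frequencies (cols)
    w2 = wx[:, None]**2 + wy[None, :]**2            # |w|^2 on the grid
    H = 1.0 / (1.0 + lam * w2**gamma)               # smoothing filter response
    return np.real(np.fft.ifft2(np.fft.fft2(img) * H))

rng = np.random.default_rng(0)
noisy = rng.standard_normal((128, 128))   # stand-in for a noisy fractal field
smoothed = fourier_domain_smooth(noisy, gamma=1.5, lam=0.5)
```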

    Wavelet Theory

    The wavelet is a powerful mathematical tool that plays an important role in science and technology. This book looks at some of the most creative and popular applications of wavelets, including biomedical signal processing, image processing, communication signal processing, the Internet of Things (IoT), acoustical signal processing, financial market data analysis, energy and power management, and COVID-19 pandemic measurements and calculations. The editor’s personal interest lies in applying the wavelet transform to identify time-domain changes in signals and their corresponding frequency components, and in improving power amplifier behavior.

    Revisiting QRS detection methodologies for portable, wearable, battery-operated, and wireless ECG systems

    Cardiovascular diseases are the number one cause of death worldwide. Currently, portable battery-operated systems such as mobile phones with wireless ECG sensors have the potential to be used in continuous cardiac function assessment that can be easily integrated into daily life. These portable point-of-care diagnostic systems can therefore help unveil and treat cardiovascular diseases. The basis for ECG analysis is a robust detection of the prominent QRS complex, as well as of other ECG signal characteristics. However, it is not clear from the literature which ECG analysis algorithms are suited for an implementation on a mobile device. We investigate current QRS detection algorithms based on three assessment criteria: 1) robustness to noise, 2) parameter choice, and 3) numerical efficiency, in order to target a universal fast-robust detector. Furthermore, existing QRS detection algorithms may provide an acceptable solution only on small segments of ECG signals, within a certain amplitude range, or under particular types of arrhythmia and/or noise. These issues are discussed in the context of a comparison with the most conventional algorithms, followed by future recommendations for developing reliable QRS detection schemes suitable for implementation on battery-operated mobile devices.
    Mohamed Elgendi, Björn Eskofier, Socrates Dokos, Derek Abbott
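
    To make the three assessment criteria concrete, here is a deliberately simple detector in the Pan-Tompkins style: its band edges, window length, and threshold are exactly the kind of parameters criterion 2 examines, and its cost is dominated by the filtering stage (criterion 3). The values below are common textbook choices, and this baseline sketch is not the detector the paper recommends.

```python
# A naive Pan-Tompkins-style QRS detector: band-pass, square, integrate, peak-pick.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def detect_qrs(ecg, fs):
    """Return sample indices of detected R peaks in a 1-D ECG trace."""
    # 1) band-pass around the QRS energy band (roughly 5-15 Hz)
    b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecg)
    # 2) differentiate and square to emphasize steep slopes
    energy = np.diff(filtered) ** 2
    # 3) moving-window integration over ~150 ms
    win = max(1, int(0.150 * fs))
    integrated = np.convolve(energy, np.ones(win) / win, mode="same")
    # 4) crude fixed threshold plus a 200 ms refractory period;
    #    robust detectors adapt this threshold over time
    peaks, _ = find_peaks(integrated,
                          height=0.5 * integrated.max(),
                          distance=int(0.200 * fs))
    return peaks
```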

    Topics on Multiresolution Signal Processing and Bayesian Modeling with Applications in Bioinformatics

    Analysis of multi-resolution signals and time-series data has wide applications in biology, medicine, engineering, etc. In many cases, the large-scale (low-frequency) features of a signal, including basic descriptive statistics, trends, and smoothed functional estimates, do not carry useful information about the phenomenon of interest. On the other hand, the study of small-scale (high-frequency) features that look like noise may be more informative, even though extracting such informative features is not always straightforward. In this dissertation we address some of the issues pertaining to high-frequency feature extraction and the denoising of noisy signals. Another topic studied in this dissertation is the integration of genome data with transatlantic voyage data of enslaved people from Africa to determine the ancestral origins of Afro-Americans. Chapter 2. Assessment of Scaling by Auto-Correlation Shells. In this chapter, we utilize the Auto-Correlation (AC) Shell to propose a feature extraction method that can effectively capture small-scale information in a signal. The AC Shell is a redundant, shift-invariant, and symmetric representation of the signal that is obtained by using the Auto-Correlation function of compactly supported wavelets. The small-scale features are extracted by computing the energy of AC Shell coefficients at different levels of decomposition, as well as the slope of the line fitted to these energy values using AC Shell spectra (sketched in code after this abstract). We discuss the theoretical properties and verify them using extensive simulations. We compare the features extracted from AC Shells with those of wavelets in terms of bias, variance, and mean square error (MSE). The results indicate that the AC Shell features tend to have smaller variance and are hence more reliable. Moreover, to show its effectiveness, we validate our feature extraction method in the context of classification, identifying patients with ovarian cancer through the analysis of their blood mass spectra. For this study, we use the features extracted by AC Shell spectra along with a support vector machine classifier to distinguish control from cancer cases. Chapter 3. Bayesian Binary Regressions in Wavelet-based Function Estimation. Wavelet shrinkage has been widely used in nonparametric statistics and signal processing for a variety of purposes, including denoising noisy signals and images, dimension reduction, and variable/feature selection. Although the traditional wavelet shrinkage methods are effective and popular, they have one major drawback: the shrinkage process relies only on the information in the coefficient being thresholded, and the information contained in neighboring coefficients is ignored. Similarly, the standard AC Shell denoising methods shrink the empirical coefficients independently, by comparing their magnitudes with a threshold value; the other coefficients have no influence on the behavior of a particular coefficient. However, due to the redundant representation of signals obtained by AC Shells, the dependency between neighboring coefficients and the amount of information they share increase. It is therefore natural to propose a new thresholding approach for AC Shell coefficients that considers the information of neighboring coefficients. In this chapter, we develop a new Bayesian denoising approach for AC Shell coefficients that integrates logistic regression, universal thresholding, and Bayesian inference.
    We validate the proposed method using extensive simulations with various types of smooth and non-smooth signals. The results indicate that, for all signal types, including the neighboring coefficients improves the denoising process, resulting in lower MSEs. Moreover, we apply our proposed methodology to a case study of denoising Atomic Force Microscopy (AFM) signals that measure the adhesion strength between two materials at the nano-newton scale, in order to correctly identify the cantilever detachment point. Chapter 4. Bayesian Method in Combining Genetic and Historical Records of the Transatlantic Slave Trade in the Americas. Between 1515 and 1865, more than 12 million people were enslaved and forced to move from Africa to North and Latin America. Shipping documents recorded the origins and disembarkation points of enslaved people. Traditionally, genealogical study has been done through the exploration of historical records, family trees, and birth certificates. Due to recent advances in genetics, genealogy has been revolutionized and has become more accurate. Although genetic methods can provide continental differentiation, they have poor spatial resolution, which makes it hard to localize ancestry assignments, as the markers are distributed across different sub-continental regions. To overcome these drawbacks, in this chapter we propose a hybrid approach that combines genetic marker results with the historical records of the transatlantic voyages of enslaved people. Adding the voyage data provides substantially increased resolution in ancestry assignment within a Bayesian modeling framework. The proposed framework uses the voyage data from historical records available in the transatlantic slave trade database as prior probabilities and combines them with the genetic markers of Afro-Americans, treated as the likelihood, to estimate the posterior (updated) probabilities of their ancestry assignments to geographical regions in Africa (a toy version of this update appears after this abstract). We apply the proposed methodology to 60 Afro-American individuals and show that the prior information increases the assignment probabilities obtained from the posterior distributions for some of the regions.
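
    A sketch of Chapter 2's energy-slope feature, using an ordinary discrete wavelet decomposition from pywt as a stand-in, since the AC Shell transform itself is not a standard library routine. The feature construction is the same idea: per-level detail energies plus the slope of a line fitted to them across scales; the wavelet name and level count are illustrative.

```python
# Energy-per-level "spectra" of a wavelet decomposition and its fitted slope.
import numpy as np
import pywt

def wavelet_spectra_slope(signal, wavelet="sym4", levels=6):
    """Return the slope of log2-energy vs. scale, plus the energies."""
    coeffs = pywt.wavedec(signal, wavelet, level=levels)
    # coeffs[1:] are the detail bands, ordered coarsest to finest
    energies = [np.log2(np.mean(d**2)) for d in coeffs[1:]]
    scales = np.arange(levels, 0, -1)          # matching scale indices
    slope, _intercept = np.polyfit(scales, energies, 1)
    return slope, energies

rng = np.random.default_rng(0)
slope, energies = wavelet_spectra_slope(rng.standard_normal(1024))
print(f"spectral slope: {slope:.2f}")   # near 0 for white noise
```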
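
    Chapter 4's update in its simplest form: voyage-record frequencies act as a prior over African regions, genetic-marker scores act as the likelihood, and Bayes' rule yields the posterior assignment. The region names and all numbers below are invented for illustration.

```python
# Toy Bayesian ancestry update: posterior = prior * likelihood, normalized.
import numpy as np

regions = ["Senegambia", "Gold Coast", "Bight of Benin", "West Central Africa"]
prior = np.array([0.10, 0.20, 0.30, 0.40])        # from voyage records (made up)
likelihood = np.array([0.05, 0.40, 0.35, 0.20])   # from genetic markers (made up)

posterior = prior * likelihood
posterior /= posterior.sum()                      # normalize over regions
for name, p in zip(regions, posterior):
    print(f"{name}: {p:.3f}")
```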

    E-Nose Vapor Identification Based on Dempster-Shafer Fusion of Multiple Classifiers

    Electronic nose (e-nose) vapor identification is an efficient approach to monitoring air contaminants in space stations and shuttles in order to ensure the health and safety of astronauts. Data preprocessing (measurement denoising and feature extraction) and pattern classification are important components of an e-nose system. In this paper, a wavelet-based denoising method is applied to filter the noisy sensor measurements. Transient-state features are then extracted from the denoised sensor measurements and are used to train multiple classifiers such as multi-layer perceptrons (MLP), support vector machines (SVM), k-nearest neighbor (KNN), and Parzen classifiers. The Dempster-Shafer (DS) technique is then used to fuse the results of the multiple classifiers into a final classification. Experimental analysis based on real vapor data shows that the wavelet denoising method can remove both random noise and outliers successfully, and that the classification rate can be improved by using classifier fusion.
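
    A minimal sketch of the Dempster-Shafer combination step, fusing two classifiers' outputs with mass assigned to each vapor class plus one "uncertain" mass on the whole frame of discernment. The mass values are invented; a real e-nose system would derive them from classifier confidences.

```python
# Dempster's rule of combination for masses on singletons plus the full frame.
import numpy as np

def dempster_combine(m1, m2):
    """Combine two mass vectors laid out as [m(class_1..class_n), m(Theta)]."""
    n = len(m1) - 1                       # last entry is the mass on Theta
    fused = np.zeros_like(m1)
    conflict = 0.0
    for i in range(n):
        for j in range(n):
            if i == j:
                fused[i] += m1[i] * m2[j]         # agreeing singletons
            else:
                conflict += m1[i] * m2[j]         # contradictory evidence
        fused[i] += m1[i] * m2[n] + m1[n] * m2[i] # singleton meets Theta
    fused[n] = m1[n] * m2[n]                      # both classifiers uncertain
    return fused / (1.0 - conflict)               # Dempster normalization

mlp = np.array([0.6, 0.2, 0.1, 0.1])   # masses: 3 vapor classes + Theta
svm = np.array([0.5, 0.3, 0.1, 0.1])
print(dempster_combine(mlp, svm))      # fused belief is sharper than either input
```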