429 research outputs found

    Machine Learning And Image Processing For Noise Removal And Robust Edge Detection In The Presence Of Mixed Noise

    The central goal of this dissertation is to design and model a smoothing filter, based on single and mixed random noise distributions, that attenuates the effect of noise while preserving edge details. Only then can robust, integrated and resilient edge detection methods be deployed to overcome the ubiquitous presence of random noise in images. Random noise effects are modeled as those arising from impulse noise, Gaussian noise and speckle noise. In the first step, methods are evaluated through an exhaustive review of the different types of denoising methods that focus on impulse noise, Gaussian noise and their related denoising filters. These include spatial filters (linear, non-linear and combinations of the two), transform-domain filters, neural-network-based filters, numerical filters, fuzzy filters, morphological filters, statistical filters, and supervised-learning-based filters. In the second step, a switching adaptive median and fixed weighted mean filter (SAMFWMF), which combines linear and non-linear filters, is introduced in order to detect and remove impulse noise. A robust edge detection method is then applied that relies on an integrated process of non-maximum suppression, maximum sequence, thresholding and morphological operations. Results are obtained on MRI and natural images. In the third step, a transform-domain filter that combines the dual-tree complex wavelet transform (DT-CWT) with total variation is introduced in order to detect and remove Gaussian noise as well as mixed Gaussian and speckle noise. A robust edge detection is then applied in order to track the true edges. Results are obtained on medical ultrasound and natural images. In the fourth step, a smoothing filter based on a deep feed-forward convolutional neural network (CNN) is introduced, supported by a specific learning algorithm, L2 loss minimization, a regularization method, and batch normalization, all integrated in order to detect and remove impulse noise as well as mixed impulse and Gaussian noise. A robust edge detection is then applied in order to track the true edges. Results are obtained on natural images for both specific and non-specific noise levels.
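    A minimal sketch of the switching-median idea behind the second step, assuming a simple local min/max impulse detector; the window size, detector rule and function name are illustrative and are not the dissertation's SAMFWMF:

    import numpy as np
    from scipy.ndimage import median_filter, minimum_filter, maximum_filter

    def switching_median_denoise(img, window=3):
        """Replace only pixels flagged as likely impulses with the local median.

        A pixel is flagged when it equals the local extremum of its window,
        the usual heuristic for salt-and-pepper (impulse) noise; clean pixels
        are left untouched, which is what preserves edge detail.
        """
        img = img.astype(np.float64)
        local_min = minimum_filter(img, size=window)
        local_max = maximum_filter(img, size=window)
        local_med = median_filter(img, size=window)
        impulse = (img == local_min) | (img == local_max)  # crude impulse detector
        return np.where(impulse, local_med, img)           # switch: filter only flagged pixels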

    Post-processing approaches for the improvement of cardiac ultrasound B-mode images: a review


    Curvelet Denoising with Improved Thresholds for Application on Ultrasound Images

    In medical image processing, image denoising has become an essential step throughout diagnosis. In medical images, a balance must be struck between preserving useful diagnostic information and suppressing noise. In ultrasound images, a particular type of acoustic noise, technically known as speckle noise, is the major cause of image-quality degradation. Many denoising techniques have been proposed for effective suppression of speckle noise, yet removing noise from the original image or signal remains a challenging problem. In this paper, a curvelet-transform-based denoising method with improved thresholds is proposed for ultrasound images.
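    A minimal sketch of the transform-domain threshold-and-invert pattern this kind of method builds on; a separable wavelet transform (PyWavelets) stands in for the curvelet transform, and the MAD-based threshold is a common default rather than the paper's improved threshold:

    import numpy as np
    import pywt

    def transform_threshold_denoise(img, wavelet="db4", level=3, k=3.0):
        """Soft-threshold detail coefficients in a transform domain, then invert.

        The noise level sigma is estimated from the finest diagonal subband
        via the median absolute deviation; the threshold k * sigma is a
        generic choice, not the paper's.
        """
        coeffs = pywt.wavedec2(img.astype(np.float64), wavelet, level=level)
        sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745   # MAD noise estimate
        thr = k * sigma
        denoised = [coeffs[0]] + [
            tuple(pywt.threshold(d, thr, mode="soft") for d in detail)
            for detail in coeffs[1:]
        ]
        return pywt.waverec2(denoised, wavelet)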

    Improved adaptive complex diffusion despeckling filter

    Despeckling optical coherence tomograms of the human retina is a fundamental step towards a better diagnosis and a preprocessing stage for retinal layer segmentation. Both applications are particularly important in monitoring the progression of retinal disorders. In this study we propose a new formulation of a well-known nonlinear complex diffusion filter: the regularization factor is made data-dependent, so the process itself becomes adaptive. Experimental results on synthetic data show the good performance of the proposed formulation, achieving better quantitative results and higher computation speed.
    Funding: Fundação para a Ciência e Tecnologia; FEDER; Programa COMPETE
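    A minimal sketch of a nonlinear complex diffusion iteration of the kind this filter extends, with fixed (non-adaptive) k and theta; the adaptive, data-dependent regularization factor proposed in the paper is not reproduced here:

    import numpy as np

    def complex_diffusion(img, n_iter=20, dt=0.2, k=2.0, theta=np.pi / 30):
        """Explicit nonlinear complex diffusion on a 2-D image.

        The diffusivity depends on the imaginary part of the evolving image,
        which acts like a smoothed edge detector and slows diffusion near
        edges; k and theta are illustrative defaults.
        """
        u = img.astype(np.complex128)
        c0 = np.exp(1j * theta)
        for _ in range(n_iter):
            # forward differences with replicated (Neumann-like) boundaries
            ux = np.diff(u, axis=1, append=u[:, -1:])
            uy = np.diff(u, axis=0, append=u[-1:, :])
            d = c0 / (1.0 + (u.imag / (k * theta)) ** 2)   # complex diffusivity
            # divergence of the flux via backward differences
            div = (np.diff(d * ux, axis=1, prepend=(d * ux)[:, :1])
                   + np.diff(d * uy, axis=0, prepend=(d * uy)[:, :1]))
            u = u + dt * div
        return u.real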

    AO-Based High Resolution Image Post-Processing


    Techniques for enhancing digital images

    The images obtained from research studies or optical instruments are often corrupted with noise. Image denoising involves the manipulation of image data to produce a visually high-quality image. This thesis reviews the existing denoising algorithms and the filtering approaches available for enhancing images and/or data transmission. Spatial-domain and transform-domain digital image filtering algorithms have been used in the past to suppress different noise models, which can be either additive or multiplicative. Selection of the denoising algorithm is application dependent, so it is necessary to know which noise is present in the image in order to select the appropriate denoising algorithm. Noise models include Gaussian noise, salt-and-pepper noise, speckle noise and Brownian noise. The wavelet transform is similar to the Fourier transform but uses completely different basis functions: wavelets are localized in both time and frequency, whereas the sinusoids of the standard Fourier transform are localized only in frequency. Wavelet analysis consists of breaking up the signal into shifted and scaled versions of the original (or mother) wavelet. The Wiener filter, which minimizes the mean squared estimation error, can be implemented as an LMS (least mean squares) filter, an RLS (recursive least squares) filter, or a Kalman filter. Quantitative comparison of the denoising algorithms is provided by calculating the Peak Signal-to-Noise Ratio (PSNR), the Mean Square Error (MSE) and the Mean Absolute Error (MAE). A combination of these metrics is often required to clearly assess model performance.
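    The three evaluation metrics named above have standard definitions; a short reference implementation (the peak value is assumed to be 255 for 8-bit images):

    import numpy as np

    def mse(ref, test):
        return np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)

    def mae(ref, test):
        return np.mean(np.abs(ref.astype(np.float64) - test.astype(np.float64)))

    def psnr(ref, test, peak=255.0):
        """Peak signal-to-noise ratio in dB; `peak` is the maximum possible
        pixel value (255 for 8-bit images, 1.0 for normalized floats)."""
        m = mse(ref, test)
        return np.inf if m == 0 else 10.0 * np.log10(peak ** 2 / m)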