
    New mixed adaptive detection algorithm for moving target with big data

    Aiming at the difficulties that traditional methods face when detecting a walking person, such as complex backgrounds, illumination changes, and shadows, this paper proposes a new adaptive detection algorithm that mixes a Gaussian Mixture Model (GMM), an edge detection algorithm, and a continuous frame-difference algorithm. In the time domain, the new algorithm uses the GMM to model and update the background. In the spatial domain, it uses a hybrid detector that mixes edge detection, continuous frame differencing, and the GMM to obtain the initial contour of the moving target with big data, and then extracts the final moving target. The algorithm not only adapts to illumination gradients and background disturbance in the scene, but also resolves problems such as inaccurate target detection, incomplete edge detection, cavitation, and ghosting that often appear in traditional algorithms. Experimental results show that the algorithm offers good real-time performance and robustness; it is easy to implement and accurately detects the moving target with big data.
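    The frame-difference part of the spatio-temporal combination described above can be illustrated on its own. Below is a minimal numpy sketch of three-frame differencing, not the paper's full GMM hybrid (the synthetic frames, threshold, and block size are made up for illustration): a pixel is kept only if it changes against both the previous and the next frame, which is one standard way to suppress the "ghost" regions mentioned in the abstract.

```python
import numpy as np

def three_frame_difference(f_prev, f_curr, f_next, thresh=25):
    """Flag a pixel as moving only if it changes against BOTH the
    previous and the next frame (suppresses the trailing 'ghost'
    left by plain two-frame differencing)."""
    d1 = np.abs(f_curr.astype(np.int16) - f_prev.astype(np.int16)) > thresh
    d2 = np.abs(f_next.astype(np.int16) - f_curr.astype(np.int16)) > thresh
    return d1 & d2

# Synthetic frames: a bright 2x2 block jumping 3 pixels per frame.
frames = [np.zeros((8, 8), dtype=np.uint8) for _ in range(3)]
for i, f in enumerate(frames):
    f[3:5, 3 * i:3 * i + 2] = 255

mask = three_frame_difference(*frames)
# mask is True exactly at the block's current (middle-frame) position
```

In the paper's hybrid, this temporal mask would then be fused with GMM foreground and edge maps to recover a complete contour.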

    Machine Learning And Image Processing For Noise Removal And Robust Edge Detection In The Presence Of Mixed Noise

    The central goal of this dissertation is to design and model a smoothing filter, based on random single and mixed noise distributions, that attenuates the effect of noise while preserving edge details. Only then can robust, integrated, and resilient edge detection methods be deployed to overcome the ubiquitous presence of random noise in images. Random noise effects are modeled as those that could emanate from impulse noise, Gaussian noise, and speckle noise. In the first step, methods are evaluated through an exhaustive review of the different types of denoising methods, focusing on impulse noise, Gaussian noise, and their related denoising filters. These include spatial filters (linear, non-linear, and combinations of them), transform-domain filters, neural-network-based filters, numerical-based filters, fuzzy-based filters, morphological filters, statistical filters, and supervised learning-based filters. In the second step, the switching adaptive median and fixed weighted mean filter (SAMFWMF), a combination of linear and non-linear filters, is introduced to detect and remove impulse noise. A robust edge detection method is then applied, relying on an integrated process including non-maximum suppression, maximum sequence, thresholding, and morphological operations. Results are obtained on MRI and natural images. In the third step, a transform-domain filter combining the dual-tree complex wavelet transform (DT-CWT) with total variation is introduced to detect and remove Gaussian noise as well as mixed Gaussian and speckle noise. A robust edge detection is then applied to track the true edges. Results are obtained on medical ultrasound and natural images.
In the fourth step, a smoothing filter in the form of a feed-forward convolutional neural network (CNN) with a deep architecture is introduced, supported by a specific learning algorithm, l2 loss-function minimization, a regularization method, and batch normalization, all integrated to detect and remove impulse noise as well as mixed impulse and Gaussian noise. A robust edge detection is then applied to track the true edges. Results are obtained on natural images at both specific and non-specific noise levels.
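    The switching idea behind the SAMFWMF in the second step can be sketched in a few lines: filter only the pixels that look like impulses and leave all other pixels untouched, so edges survive. This is a simplified numpy illustration, not the dissertation's actual filter (the 0/255 impulse test and 3x3 window are assumptions of the sketch):

```python
import numpy as np

def switching_median(img, win=3):
    """Replace only suspected impulse pixels (extreme values 0 or 255)
    with the median of their neighbourhood; leave other pixels intact."""
    pad = win // 2
    padded = np.pad(img, pad, mode='edge')
    out = img.copy()
    rows, cols = img.shape
    for r in range(rows):
        for c in range(cols):
            if img[r, c] in (0, 255):            # impulse candidate
                window = padded[r:r + win, c:c + win]
                out[r, c] = np.median(window)
    return out

# A flat 100-valued image with one salt and one pepper impulse.
img = np.full((5, 5), 100, dtype=np.uint8)
img[1, 1] = 255   # salt
img[3, 3] = 0     # pepper
clean = switching_median(img)
# both impulses are restored to 100; every other pixel is untouched
```

The "switching" detector is what distinguishes this family from a plain median filter, which would also blur uncorrupted pixels.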

    Review on Colour Image Denoising using Wavelet Soft Thresholding Technique

    In this modern age of communication, images and video are important, as visual information is transmitted in the form of digital images; after transmission, however, the image is often corrupted by noise. The received image therefore needs processing before it can be used in further applications. Image denoising involves manipulating the image data to produce a high-quality image free of noise. Most prior work on color images follows the filter-domain approach, but the transform-domain approach gives strong results in color image denoising. This paper reviews the several types of noise that corrupt color images as well as the existing denoising algorithms based on the wavelet soft-thresholding technique. DOI: 10.17762/ijritcc2321-8169.15039
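    The soft-thresholding rule at the heart of the reviewed technique is easy to state: shrink every wavelet coefficient toward zero by a threshold t, zeroing the small, noise-dominated ones. A minimal numpy sketch of just the shrinkage rule follows (a full denoiser would also need a forward and inverse wavelet transform, e.g. via PyWavelets, and a threshold-selection rule):

```python
import numpy as np

def soft_threshold(coeffs, t):
    """Wavelet-domain soft thresholding: shrink each coefficient's
    magnitude by t, clamping anything below t to exactly zero."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

c = np.array([-5.0, -0.5, 0.0, 0.3, 2.0])
shrunk = soft_threshold(c, t=1.0)
# small coefficients (mostly noise) vanish; large ones shrink by t
```

Unlike hard thresholding, this rule is continuous, which is why it tends to produce fewer ringing artifacts in the reconstruction.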

    Wavelet Based Color Image Denoising through a Bivariate Pearson Distribution

    In this paper we propose an efficient algorithm for color image denoising through a bivariate Pearson distribution in the wavelet domain, based on Bayesian denoising. When Bayesian denoising is used to recover an image from its noisy observation, performance depends strictly on the correctness of the distribution used to describe the data, so the denoising process requires selecting a proper distribution model. In this paper, the bivariate Pearson distribution is used to describe the image data and a Gaussian distribution to describe the noise. Extensive work has been done on grayscale images, but for color images, denoising with a bivariate Pearson distribution under the Bayesian framework gives excellent results for analysing color images, which can be used in several advanced applications. The bivariate probability density function (pdf) takes into account the dependency among wavelet coefficients. Experimental results show that the proposed technique outperforms several existing methods both visually and in terms of peak signal-to-noise ratio (PSNR).
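    PSNR, the quality metric used in the comparison above, is defined as 10·log10(peak²/MSE) between the reference and the denoised image. A minimal numpy implementation follows; the tiny example image is synthetic and for illustration only:

```python
import numpy as np

def psnr(reference, test, peak=255.0):
    """Peak signal-to-noise ratio in dB for 8-bit images."""
    mse = np.mean((reference.astype(np.float64) -
                   test.astype(np.float64)) ** 2)
    return float('inf') if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

ref = np.full((4, 4), 100, dtype=np.uint8)
noisy = ref.copy()
noisy[0, 0] = 110   # one pixel off by 10 -> MSE = 100/16 = 6.25
value = psnr(ref, noisy)   # ~40.17 dB
```

Higher is better; identical images give infinite PSNR, which is why denoising papers report it alongside visual comparisons.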

    An overview of the fundamental approaches that yield several image denoising techniques

    The digital image is a powerful tool for carrying and transmitting information between people. It thus attracts the attention of a large number of researchers, among them those interested in preserving image features from any factor that may reduce image quality. One of these factors is noise, which affects the visual quality of the image and makes further image processing more difficult. Thus far, solving this noise problem remains a challenge for researchers in this field. Many image-denoising techniques have been introduced to remove the noise while taking care of the image features; in other words, to recover from the noisy image the best possible approximation of the original. However, the findings are still inconclusive. Beside the enormous body of research adopting various mathematical concepts (statistics, probability, modeling, PDEs, wavelets, fuzzy logic, etc.), there is a scarcity of review papers, which play an important role in the development and progress of research. This review paper therefore introduces an overview of the fundamental approaches that yield the several image-denoising techniques, presented with a new classification. Furthermore, the paper presents the evaluation tools needed for comparing these techniques, in order to facilitate work on this noise problem amid a great diversity of techniques and concepts.

    Optimum Image Filters for Various Types of Noise

    In this paper, the restoration quality of several filters applied to images corrupted with various types of noise is examined extensively. In particular, the Wiener filter, Gaussian filter, median filter, and averaging (mean) filter are used to reduce Gaussian noise, speckle noise, salt-and-pepper noise, and Poisson noise. Many images have been tested, two of which are shown in this paper, over several percentages of noise corruption. The size of the sliding window is the same for all four filters, namely 5x5, at all the indicated noise percentages. For image-quality measurement, two performance indices are used: peak signal-to-noise ratio (PSNR) and structural similarity (SSIM). The simulation results show that some filters reduce certain types of noise much better than others. The median filter is the most appropriate for eliminating salt-and-pepper noise; the averaging filter also works for this type of noise, but with lower quality than the median filter. The Gaussian and Wiener filters outperform the other filters in restoring images corrupted with Poisson and speckle noise.
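    The central finding above, that the median filter beats the mean filter on salt-and-pepper noise, can be reproduced on a synthetic flat image. The sketch below uses numpy only and is not the paper's experimental setup (the image, 10% noise rate, and reuse of a 5x5 window are illustrative):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def filter2d(img, win, reducer):
    """Apply a win x win sliding-window reducer with edge padding."""
    pad = win // 2
    padded = np.pad(img.astype(np.float64), pad, mode='edge')
    windows = sliding_window_view(padded, (win, win)).reshape(*img.shape, -1)
    return reducer(windows, axis=-1)

rng = np.random.default_rng(0)
clean = np.full((32, 32), 128.0)
noisy = clean.copy()
corrupt = rng.random(clean.shape) < 0.10          # 10% salt-and-pepper
noisy[corrupt] = rng.choice([0.0, 255.0], size=corrupt.sum())

err_median = np.mean((filter2d(noisy, 5, np.median) - clean) ** 2)
err_mean = np.mean((filter2d(noisy, 5, np.mean) - clean) ** 2)
# the median rejects impulses outright, so its MSE is far lower
```

The mean filter averages the extreme impulse values into every window, while the median simply discards them unless they dominate the window, which explains the gap the paper measures in PSNR and SSIM.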

    Automatic Denoising and Unmixing in Hyperspectral Image Processing

    This thesis addresses two important aspects of hyperspectral image processing: automatic hyperspectral image denoising and unmixing. The first part of this thesis is devoted to a novel automatic optimized vector bilateral filter denoising algorithm, while the remainder concerns nonnegative matrix factorization with deterministic annealing for unsupervised unmixing of remote sensing hyperspectral images. The need for automatic hyperspectral image processing has been driven by the development of potent hyperspectral systems, with hundreds of narrow contiguous bands spanning the visible to long-wave infrared range of the electromagnetic spectrum. Due to the large volume of raw data generated by such sensors, automatic processing in the hyperspectral image processing chain is preferred, to minimize human workload and achieve optimal results. Two of the most researched steps in this automation effort are hyperspectral image denoising, an important preprocessing step for almost all remote sensing tasks, and unsupervised unmixing, which decomposes the pixel spectra into a collection of endmember spectral signatures and their corresponding abundance fractions. Two new methodologies are introduced in this thesis to tackle the automatic processing problems described above. Vector bilateral filtering has been shown to provide a good tradeoff between noise removal and edge degradation when applied to multispectral/hyperspectral image denoising, and to provide dynamic-range enhancement of bands with impaired signal-to-noise ratios. Typical vector bilateral filtering usage, however, does not employ parameters chosen to satisfy optimality criteria. This thesis therefore introduces an approach for selecting the parameters of a vector bilateral filter through an optimization procedure rather than by ad hoc means.
The approach is based on posing the filtering problem as one of nonlinear estimation and minimizing Stein's unbiased risk estimate (SURE) of this nonlinear estimator. Along the way, this thesis provides a plausibility argument, with an analytical example, as to why vector bilateral filtering outperforms band-wise 2D bilateral filtering in enhancing SNR. Experimental results show that the optimized vector bilateral filter provides improved denoising performance on multispectral images compared to several other approaches. The non-negative matrix factorization (NMF) technique and its extensions were developed to find part-based, linear representations of non-negative multivariate data. They have been shown to provide more interpretable results, with a realistic non-negativity constraint, in unsupervised learning applications such as hyperspectral imagery unmixing, image feature extraction, and data mining. This thesis extends the NMF method by incorporating a deterministic annealing optimization procedure, which helps solve the non-convexity problem in NMF and provides a better choice of sparseness constraint. The approach replaces the difficult non-convex optimization problem of NMF with an easier one by adding an auxiliary convex entropy constraint term and solving this first. Experimental results on a hyperspectral unmixing application show that the proposed technique provides improved unmixing performance compared to other state-of-the-art methods.
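    The NMF baseline that the thesis extends can be sketched with the classic Lee-Seung multiplicative updates for V ≈ WH. This is plain Frobenius-norm NMF in numpy, not the deterministic-annealing variant the thesis proposes (the problem sizes, iteration count, and synthetic endmembers are arbitrary):

```python
import numpy as np

def nmf(V, k, iters=500, seed=0, eps=1e-9):
    """Lee-Seung multiplicative updates for V ~= W @ H, minimising the
    Frobenius reconstruction error while keeping W, H nonnegative."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, k)) + 0.1
    H = rng.random((k, m)) + 0.1
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update abundances
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update endmembers
    return W, H

# Mix two nonnegative 'endmember' spectra with random abundances.
endmembers = np.array([[1.0, 0.0, 2.0, 0.0],
                       [0.0, 3.0, 0.0, 1.0]])
abundances = np.random.default_rng(1).random((6, 2))
V = abundances @ endmembers
W, H = nmf(V, k=2)
recon_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

Because the updates only multiply by nonnegative ratios, W and H stay nonnegative throughout, which is the property that makes the factors physically interpretable as spectra and abundances; the annealing extension then addresses the non-convexity that this plain scheme inherits.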