
    Image blur estimation based on the average cone of ratio in the wavelet domain

    In this paper, we propose a new algorithm for objective blur estimation using wavelet decomposition. The central idea of our method is to estimate blur as a function of the center of gravity of the average cone ratio (ACR) histogram. The key properties of ACR are twofold: it is powerful in estimating local edge regularity, and it is nearly insensitive to noise. We use these properties to estimate the blurriness of the image irrespective of the noise level. In particular, the center of gravity of the ACR histogram serves as a blur metric. The method is applicable both when a reference image is available and when there is none. The results demonstrate consistent performance of the proposed metric for a wide class of natural images and over a wide range of out-of-focus blur. Moreover, the proposed method shows a remarkable insensitivity to noise compared to other wavelet-domain methods.
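
    The paper's exact construction of the average cone ratio is not reproduced here; the following Python sketch, under assumed choices (a two-level db2 decomposition, a 90th-percentile edge mask, 64 histogram bins), only illustrates the general shape of such a metric: inter-scale ratios of wavelet detail magnitudes are histogrammed, and the histogram's center of gravity is returned as the blur score.

```python
# Hypothetical sketch of a histogram-center-of-gravity blur score built from
# inter-scale wavelet coefficient ratios; the ACR definition itself follows
# the paper and is not reproduced here.
import numpy as np
import pywt

def blur_score(image, wavelet="db2", eps=1e-8):
    # Two-level 2D wavelet decomposition: [cA2, (H2, V2, D2), (H1, V1, D1)].
    _, (h2, v2, d2), (h1, v1, d1) = pywt.wavedec2(image, wavelet, level=2)
    coarse = np.sqrt(h2**2 + v2**2 + d2**2)           # scale-2 detail magnitude
    fine = np.sqrt(h1**2 + v1**2 + d1**2)[::2, ::2]   # scale-1, decimated

    # Crop both maps to a common grid, then keep only edge-like locations.
    h = min(coarse.shape[0], fine.shape[0])
    w = min(coarse.shape[1], fine.shape[1])
    coarse, fine = coarse[:h, :w], fine[:h, :w]
    mask = coarse > np.percentile(coarse, 90)

    # Inter-scale ratio histogram; its center of gravity is the blur score.
    ratios = (fine[mask] + eps) / (coarse[mask] + eps)
    hist, edges = np.histogram(ratios, bins=64, range=(0.0, 2.0))
    centers = 0.5 * (edges[:-1] + edges[1:])
    return float(np.sum(centers * hist) / (np.sum(hist) + eps))
```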

    A Reduced Reference Image Quality Measure Using Bessel K Forms Model for Tetrolet Coefficients

    In this paper, we introduce a Reduced Reference Image Quality Assessment (RRIQA) measure based on the natural image statistics approach. A new adaptive transform called the "tetrolet" transform is applied to both the reference and distorted images, and the marginal distribution of the tetrolet coefficients is modeled with a Bessel K Forms (BKF) density. Estimating the parameters of this distribution makes it possible to summarize the reference image with a small amount of side information. Five distortion measures based on the BKF parameters of the original and processed images are used to predict quality scores. A comparison between these measures is presented, showing good consistency with human judgment.
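
    As a rough illustration of the parameter-based approach, the sketch below estimates BKF parameters by the method of moments (for a BKF density with shape p and scale c, the variance is p·c and the excess kurtosis is 3/p) and forms a simple parameter-difference distortion score. A plain Haar wavelet subband stands in for the tetrolet coefficients, for which no off-the-shelf transform is assumed here.

```python
# Hedged sketch: moment-based BKF parameter estimation per subband, and an
# illustrative L1 parameter-gap distortion score between reference and
# distorted images. A Haar DWT subband substitutes for tetrolet coefficients.
import numpy as np
import pywt
from scipy.stats import kurtosis

def bkf_params(coeffs, eps=1e-8):
    """Moment estimators: Var = p*c, excess kurtosis = 3/p for BKF(p, c)."""
    flat = np.ravel(coeffs)
    k = max(kurtosis(flat, fisher=True), eps)  # excess kurtosis, kept positive
    p = 3.0 / k
    c = np.var(flat) / p
    return p, c

def bkf_distortion(ref, dist, wavelet="haar"):
    _, ref_details = pywt.dwt2(ref, wavelet)    # (H, V, D) detail subbands
    _, dist_details = pywt.dwt2(dist, wavelet)
    score = 0.0
    for r, d in zip(ref_details, dist_details):
        p_r, c_r = bkf_params(r)
        p_d, c_d = bkf_params(d)
        score += abs(p_r - p_d) + abs(c_r - c_d)
    return score
```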

    Assessment of speckle denoising filters for digital holography using subjective and objective evaluation models

    Digital holography is an emerging imaging technique for displaying and sensing three-dimensional objects. The perceived image quality of a hologram is frequently corrupted by speckle noise due to coherent illumination. Although several speckle noise reduction methods have been developed, quality assessment studies addressing their performance are scarce and typically focus solely on objective metrics, which do not reflect the visual quality perceived by a human observer. In this work, the performance of four speckle reduction algorithms, namely the non-local means, Lee, Frost, and block-matching 3D filters, with varying parameterizations, was subjectively evaluated. The results were ranked with respect to perceived image quality to obtain mean opinion scores using pairwise comparison. The correlation between the subjective results and twenty different no-reference objective quality metrics was then evaluated. The experiment indicates that the block-matching 3D and Lee filters are preferred, depending on hologram characteristics. The best-performing objective metrics were identified for each filter.
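
    Of the four filters compared, the Lee filter is simple enough to sketch; the window size and the global noise-variance estimate below are illustrative choices, not the study's parameterization.

```python
# Minimal sketch of the classic Lee speckle filter: shrink each pixel toward
# the local mean according to the ratio of local signal variance to noise
# variance.
import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(img, window=7, noise_var=None):
    img = np.asarray(img, dtype=float)
    mean = uniform_filter(img, window)
    sq_mean = uniform_filter(img**2, window)
    var = np.maximum(sq_mean - mean**2, 0.0)   # local variance
    if noise_var is None:
        noise_var = float(np.mean(var))        # crude global noise estimate
    gain = var / (var + noise_var)             # adaptive weight in [0, 1)
    return mean + gain * (img - mean)
```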

    Statistical Properties and Applications of Empirical Mode Decomposition

    Signal analysis is key to extracting information buried in noise, and signal decomposition is a data analysis tool for determining the underlying physical components of a data set. However, conventional decomposition approaches such as wavelet analysis, the Wigner-Ville distribution, and various short-time Fourier spectrograms are inadequate for many real-world signals. Moreover, most of these techniques require a priori knowledge of the processed signal in order to select a proper decomposition basis, which makes them unsuitable for a wide range of practical applications. Empirical Mode Decomposition (EMD) is a non-parametric, adaptive decomposition capable of breaking down non-linear, non-stationary signals into a finite set of intrinsic components called Intrinsic Mode Functions (IMFs). In addition, EMD approximates a dyadic filter bank that isolates high-frequency components, e.g. noise, in the lower-index IMFs. Despite being widely used in different applications, EMD is an ad hoc solution: its adaptive performance comes at the expense of a solid theoretical foundation, so numerical analysis is usually adopted in the literature to interpret its behavior. This dissertation investigates statistical properties of EMD and uses the outcome to enhance the performance of signal de-noising and spectrum sensing systems. The novel contributions fall into three categories: a statistical analysis of the probability distributions of the IMFs, suggesting the Generalized Gaussian distribution (GGD) as the best-fitting distribution; a de-noising scheme based on a null hypothesis test on the IMFs that exploits the unique filter behavior of EMD; and a novel noise estimation approach, based on the first IMF, that turns semi-blind spectrum sensing techniques into fully blind ones. These contributions are justified statistically and analytically and include comparisons with other state-of-the-art techniques.
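
    A minimal sketch of the IMF-thresholding idea, using the third-party PyEMD package: the first IMF is treated as a noise reference, and later IMFs are kept only when their energy clearly exceeds a geometric noise-decay model. The decay rate and threshold factor below are illustrative stand-ins, not the dissertation's exact null-hypothesis test.

```python
# Hedged EMD de-noising sketch: reject IMFs whose energy is consistent with
# the noise-only energy extrapolated geometrically from the first IMF.
import numpy as np
from PyEMD import EMD  # third-party package, installed as "EMD-signal"

def emd_denoise(signal, rho=2.0, k=2.0):
    imfs = EMD()(signal)                 # rows: IMF_1 (highest freq.) onward
    e_noise = np.sum(imfs[0] ** 2)       # first IMF as the noise reference
    denoised = np.zeros_like(signal, dtype=float)
    for i, imf in enumerate(imfs[1:], start=1):
        # Noise-only IMF energies decay roughly geometrically with index
        # (EMD's dyadic filter-bank behavior); keep IMFs that clearly exceed it.
        if np.sum(imf ** 2) > k * e_noise * rho ** (-i):
            denoised += imf
    return denoised
```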

    PHM survey: implementation of signal processing methods for monitoring bearings and gearboxes

    The reliability and safety of industrial equipment are among the main objectives of companies seeking to remain competitive in sectors that are ever more demanding in terms of cost and safety, where an unexpected shutdown can lead to physical injury as well as economic consequences. This paper shows the emergence of the Prognostics and Health Management (PHM) concept in industry and describes how it complements the different maintenance strategies. It describes the benefits to be expected from implementing signal processing, diagnostic, and prognostic methods in health monitoring. More specifically, it provides a state of the art of existing signal processing techniques that can be used in a PHM strategy, showing the diversity of possible techniques and helping practitioners choose among them those that define a framework for monitoring sensitive components such as bearings and gearboxes.
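
    Envelope analysis is one of the staple techniques such surveys cover for bearing monitoring. The sketch below shows the standard pipeline under assumed, illustrative band edges: band-pass filtering around a structural resonance, Hilbert-transform envelope extraction, and inspection of the envelope spectrum for bearing fault frequencies.

```python
# Illustrative envelope-analysis pipeline for bearing fault detection.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def envelope_spectrum(x, fs, band=(2000.0, 8000.0)):
    # Band-pass around an assumed resonance band excited by fault impacts.
    b, a = butter(4, band, btype="bandpass", fs=fs)
    x_bp = filtfilt(b, a, x)
    # Amplitude envelope via the analytic signal, with the DC offset removed.
    env = np.abs(hilbert(x_bp))
    env -= env.mean()
    # Envelope spectrum: peaks near bearing fault frequencies flag defects.
    freqs = np.fft.rfftfreq(len(env), d=1.0 / fs)
    spec = np.abs(np.fft.rfft(env)) / len(env)
    return freqs, spec
```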

    Signal processing algorithms for enhanced image fusion performance and assessment

    The dissertation presents several signal processing algorithms for image fusion in noisy multimodal conditions. It introduces a novel image fusion method that performs well for image sets heavily corrupted by noise. As opposed to current image fusion schemes, the method requires no a priori knowledge of the noise component. The image is decomposed using Chebyshev polynomials (CP) as basis functions to perform fusion at the feature level. The properties of CP, namely fast convergence and smooth approximation, render it ideal for heuristic and indiscriminate denoising fusion tasks. Quantitative evaluation using objective fusion assessment methods shows favourable performance of the proposed scheme compared to previous image fusion efforts, notably on heavily corrupted images. The approach is further improved by combining the advantages of CP with a state-of-the-art fusion technique, independent component analysis (ICA), for joint fusion processing based on region saliency. While CP fusion is robust under severe noise conditions, it is prone to eliminating high-frequency information from the images involved, thereby limiting image sharpness; fusion using ICA, on the other hand, performs well in transferring edges and other salient features of the input images into the composite output. The combination of both methods, coupled with several mathematical morphological operations in an algorithm fusion framework, is shown to be a viable solution, and the quantitative metrics for the proposed approach are very encouraging as far as joint fusion and denoising are concerned. Another focus of this dissertation is a novel texture-based metric for image fusion evaluation. The conservation of background textural detail is important in many fusion applications, as it helps define image depth and structure, which may prove crucial in surveillance and remote sensing. Our work evaluates the performance of image fusion algorithms by their ability to retain textural details through the fusion process. This is done by using the gray-level co-occurrence matrix (GLCM) model to extract second-order statistical features for the derivation of an image texture measure, which then replaces the edge-based calculations in an objective fusion metric. Performance evaluation on established fusion methods verifies that the proposed metric is viable, especially for multimodal scenarios.
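
    As a rough sketch of the texture-measure idea, the snippet below computes a GLCM-based score from second-order statistics with scikit-image; the chosen properties, offsets, and averaging are illustrative, not the metric's exact derivation. A fusion metric in this spirit would compare such scores between each input and the fused output, rewarding texture preservation.

```python
# Hedged sketch: a scalar texture score from second-order GLCM statistics.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def texture_score(img_u8):
    # Co-occurrence matrices at distance 1 for four directions (uint8 input).
    glcm = graycomatrix(img_u8, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    # Average a few second-order features over all offsets.
    feats = [graycoprops(glcm, p).mean()
             for p in ("contrast", "homogeneity", "energy")]
    return float(np.mean(feats))
```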