6,777 research outputs found

    Quantifying image distortion based on Gabor filter bank and multiple regression analysis

    Image quality assessment is indispensable for image-based applications. Approaches to image quality assessment fall into two main categories: subjective and objective methods. Subjective assessment has been widely used; however, careful subjective assessments are experimentally difficult and time-consuming, and the results obtained may vary with the test conditions. Objective image quality assessment, on the other hand, not only alleviates these difficulties but also broadens the range of applications. Several methods have therefore been developed for quantifying the distortion present in an image, achieving goodness of fit between subjective and objective scores of up to 92%. Nevertheless, current methodologies are designed under the assumption that the nature of the distortion is known. This is generally a limiting assumption for practical applications, since in the majority of cases the distortions affecting an image are unknown. We therefore believe that current image quality assessment methods should be adapted to identify and quantify image distortions at the same time. This combination can improve processes such as enhancement, restoration, compression, and transmission, among others. We present an approach based on the power of experimental design and the joint localization of Gabor filters for studying the influence of spatial frequencies on image quality assessment, which achieves correct identification and quantification of the distortions affecting images. The method provides accurate scores and differentiability between distortions.
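    As a rough illustration of the kind of pipeline this abstract describes, the sketch below extracts Gabor filter bank energy features from an image and fits a multiple linear regression mapping those features to subjective quality scores. The filter parameters, feature choice, and training data are assumptions for the example, not the authors' exact design.

```python
# Hypothetical sketch: Gabor filter bank features + multiple linear regression
# for predicting a subjective quality score. Parameters are illustrative.
import numpy as np

def gabor_kernel(size, freq, theta, sigma):
    """Real part of a Gabor kernel at the given spatial frequency/orientation."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)            # rotated coordinate
    envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))  # Gaussian envelope
    return envelope * np.cos(2.0 * np.pi * freq * xr)     # cosine carrier

def gabor_features(image, freqs=(0.1, 0.25, 0.4),
                   thetas=(0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Mean response energy per (frequency, orientation) channel."""
    feats = []
    for f in freqs:
        for t in thetas:
            k = gabor_kernel(15, f, t, sigma=4.0)
            # FFT-based (circular) convolution keeps the sketch dependency-free.
            resp = np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(k, s=image.shape)).real
            feats.append(np.mean(resp**2))
    return np.array(feats)

def fit_quality_model(images, subjective_scores):
    """Least-squares multiple regression: score ~ intercept + Gabor energies."""
    X = np.vstack([gabor_features(im) for im in images])
    X = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X, np.asarray(subjective_scores, float), rcond=None)
    return beta

def predict_quality(image, beta):
    return float(np.concatenate([[1.0], gabor_features(image)]) @ beta)
```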

    WARP: Wavelets with adaptive recursive partitioning for multi-dimensional data

    Effective identification of asymmetric and local features in images and other data observed on multi-dimensional grids plays a critical role in a wide range of applications, including biomedical and natural image processing. Moreover, the ever-increasing amount of image data, in terms of both the resolution per image and the number of images processed per application, requires algorithms and methods for such applications to be computationally efficient. We develop a new probabilistic framework for multi-dimensional data that overcomes these challenges by incorporating data adaptivity into discrete wavelet transforms, allowing them to adapt to the geometric structure of the data while maintaining linear computational scalability. By exploiting a connection between the local directionality of wavelet transforms and recursive dyadic partitioning on the grid points of the observation, we obtain the desired adaptivity by adding to the traditional Bayesian wavelet regression framework an additional layer of Bayesian modeling on the space of recursive partitions over the grid points. We derive the corresponding inference recipe in the form of a recursive representation of the exact posterior, and develop a class of efficient recursive message passing algorithms that achieve exact Bayesian inference with a computational complexity linear in the resolution and sample size of the images. While our framework is applicable to a range of problems including multi-dimensional signal processing, compression, and structural learning, we illustrate how it works and evaluate its performance in the context of 2D and 3D image reconstruction using real images from the ImageNet database. We also apply the framework to analyze a data set from retinal optical coherence tomography.
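    The connection between wavelet transforms and recursive dyadic partitioning can be seen in a plain, non-adaptive 2D Haar transform, sketched below: each level splits the grid dyadically and recurses on the low-pass band, at a cost linear in the number of pixels. WARP's contribution, Bayesian modeling over the space of recursive partitions, is not reproduced here; this is only the fixed-partition baseline.

```python
# Fixed recursive dyadic partitioning baseline: multi-level 2D Haar transform.
# WARP would instead infer the partition (split choices) in a Bayesian way.
import numpy as np

def haar2d_level(a):
    """One level of the 2D Haar transform: returns (LL, (LH, HL, HH))."""
    lo = (a[:, 0::2] + a[:, 1::2]) / np.sqrt(2.0)   # pair sums along columns
    hi = (a[:, 0::2] - a[:, 1::2]) / np.sqrt(2.0)   # pair differences along columns
    LL = (lo[0::2] + lo[1::2]) / np.sqrt(2.0)
    HL = (lo[0::2] - lo[1::2]) / np.sqrt(2.0)
    LH = (hi[0::2] + hi[1::2]) / np.sqrt(2.0)
    HH = (hi[0::2] - hi[1::2]) / np.sqrt(2.0)
    return LL, (LH, HL, HH)

def haar2d(a, levels):
    """Recurse on the low-pass band only; total cost is linear in the pixel count."""
    details = []
    for _ in range(levels):
        a, d = haar2d_level(a)
        details.append(d)
    return a, details

img = np.random.rand(64, 64)            # any 2^k x 2^k array
approx, details = haar2d(img, levels=3)
```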

    The Haar Wavelet Transform of a Dendrogram: Additional Notes

    We consider the wavelet transform of a finite, rooted, node-ranked, p-way tree, focusing on the case of binary (p = 2) trees. We study a Haar wavelet transform on this tree. Wavelet transforms allow for multiresolution analysis through translation and dilation of a wavelet function. We explore how this works in our tree context. Comment: 37 pp, 1 fig. Supplementary material to "The Haar Wavelet Transform of a Dendrogram", http://arxiv.org/abs/cs.IR/060810
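    A minimal sketch of what a Haar-like transform over a binary dendrogram can look like: each internal node yields a detail coefficient from the difference of its children's cluster means, while the root retains the overall mean. The nested-tuple tree representation and the exact normalization are assumptions made for illustration, not the paper's construction.

```python
# Hypothetical sketch of a Haar-like transform over a binary dendrogram.
# A tree is a leaf (1D numpy array) or a tuple (left_subtree, right_subtree).
import numpy as np

def haar_dendrogram(node):
    """Return (cluster mean, cluster size, list of detail coefficients)."""
    if isinstance(node, np.ndarray):                  # leaf observation
        return node.astype(float), 1, []
    left, right = node
    m_l, n_l, d_l = haar_dendrogram(left)
    m_r, n_r, d_r = haar_dendrogram(right)
    mean = (n_l * m_l + n_r * m_r) / (n_l + n_r)      # smooth (scaling) part
    detail = m_l - m_r                                # detail at this internal node
    return mean, n_l + n_r, d_l + d_r + [detail]

# Four 2-d observations clustered as ((a, b), (c, d)):
tree = ((np.array([1.0, 0.0]), np.array([1.2, 0.1])),
        (np.array([4.0, 3.0]), np.array([3.8, 3.2])))
root_mean, size, details = haar_dendrogram(tree)      # 3 detail vectors, 1 mean
```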

    MDL Denoising Revisited

    We refine and extend an earlier MDL denoising criterion for wavelet-based denoising. We start by showing that the denoising problem can be reformulated as a clustering problem, where the goal is to obtain separate clusters for informative and non-informative wavelet coefficients, respectively. This suggests two refinements: adding a code-length for the model index, and extending the model to account for subband-dependent coefficient distributions. A third refinement is the derivation of soft thresholding inspired by predictive universal coding with weighted mixtures. We propose a practical method incorporating all three refinements, which is shown to achieve good performance and robustness in denoising both artificial and natural signals. Comment: Submitted to IEEE Transactions on Information Theory, June 200
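    To make the soft-thresholding refinement concrete, the sketch below denoises a 1D signal with a Haar transform, soft thresholding of the detail coefficients, and an inverse transform. The threshold is the common universal rule with a median-based noise estimate, standing in for the MDL-derived criterion of the paper.

```python
# Wavelet soft-thresholding sketch on a length-2^k 1D signal; the threshold is
# the standard universal rule, used here in place of the paper's MDL criterion.
import numpy as np

def haar_fwd(x):
    """Full 1D Haar decomposition: returns (coarse approx, details coarse-to-fine)."""
    details, approx = [], np.asarray(x, float)
    while len(approx) > 1:
        even, odd = approx[0::2], approx[1::2]
        details.append((even - odd) / np.sqrt(2.0))
        approx = (even + odd) / np.sqrt(2.0)
    return approx, details[::-1]

def haar_inv(approx, details):
    x = approx
    for d in details:
        even, odd = (x + d) / np.sqrt(2.0), (x - d) / np.sqrt(2.0)
        out = np.empty(2 * len(x))
        out[0::2], out[1::2] = even, odd
        x = out
    return x

def soft_threshold(d, t):
    return np.sign(d) * np.maximum(np.abs(d) - t, 0.0)

def denoise(y):
    approx, details = haar_fwd(y)
    sigma = np.median(np.abs(details[-1])) / 0.6745    # noise level from finest band
    t = sigma * np.sqrt(2.0 * np.log(len(y)))          # universal threshold
    return haar_inv(approx, [soft_threshold(d, t) for d in details])
```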

    Model for Estimation of Bounds in Digital Coding of Seabed Images

    This paper proposes a novel model for estimating bounds in the digital coding of images. Entropy coding of images is exploited to measure the useful information content of the data. The bit rate achieved by reversible compression, following the rate-distortion theory approach, takes into account the contribution of the observation noise and the intrinsic information of a hypothetical noise-free image. Assuming a Laplacian probability density function for the quantizer input signal, SQNR gains are calculated for an image predictive coding system with a non-adaptive quantizer, for white and correlated noise respectively. The proposed model is evaluated on seabed images; however, the model presented in this paper can be applied to any signal with a Laplacian distribution.
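    A small numeric sketch of the quantities such a model trades off: uniform quantization of Laplacian-distributed prediction residuals, the resulting SQNR, and the empirical entropy of the quantizer output (an estimate of the achievable lossless rate). The step size and residual statistics are assumed values for illustration, not the paper's bound model.

```python
# Numeric sketch: SQNR and output entropy of a uniform quantizer applied to
# Laplacian prediction residuals (assumed unit scale, assumed step size).
import numpy as np

rng = np.random.default_rng(0)
residuals = rng.laplace(loc=0.0, scale=1.0, size=200_000)

step = 0.5                                    # quantizer step size (assumption)
indices = np.round(residuals / step).astype(int)
reconstructed = indices * step

noise = residuals - reconstructed
sqnr_db = 10.0 * np.log10(np.mean(residuals**2) / np.mean(noise**2))

# Empirical entropy of the quantizer indices: the rate an ideal entropy coder
# would approach for this source, in bits per sample.
_, counts = np.unique(indices, return_counts=True)
p = counts / counts.sum()
entropy_bits = -np.sum(p * np.log2(p))

print(f"SQNR: {sqnr_db:.2f} dB, rate: {entropy_bits:.2f} bits/sample")
```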