5 research outputs found

    Compression And Segmentation Of Images Using An Inter-Subband Wavelet Probability Model

    Study of the statistical qualities of images can lead to elegant solutions for many classical image processing and computer vision problems. Image compression, restoration, interpolation, and texture segmentation can be written in terms of probabilistic decision making, where the prior model is the set of "natural images." Recently, we introduced a statistical characterization of natural images in the wavelet transform domain. This characterization describes the joint statistics between pairs of subband coefficient magnitudes at adjacent spatial locations, orientations, and scales. This paper describes our work in image compression and unsupervised segmentation that utilizes inter-subband dependencies. In order to support unsupervised segmentation with a large feature set, this paper presents a novel unsupervised algorithm to reduce feature dimensionality ca..
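The key empirical observation behind this line of work is that raw wavelet coefficients at adjacent locations are nearly decorrelated while their magnitudes are strongly correlated. A minimal numpy sketch of that measurement, using a hand-rolled one-level Haar transform and a toy image (both assumptions for illustration, not the paper's pipeline):

```python
import numpy as np

def haar2d(img):
    """One level of a separable 2-D Haar transform (averages/differences)."""
    a = (img[:, 0::2] + img[:, 1::2]) / 2.0   # horizontal lowpass
    d = (img[:, 0::2] - img[:, 1::2]) / 2.0   # horizontal highpass
    ll = (a[0::2, :] + a[1::2, :]) / 2.0
    lh = (a[0::2, :] - a[1::2, :]) / 2.0
    hl = (d[0::2, :] + d[1::2, :]) / 2.0
    hh = (d[0::2, :] - d[1::2, :]) / 2.0
    return ll, lh, hl, hh

def adjacent_corr(band):
    """Correlation between each coefficient and its right-hand neighbour."""
    return np.corrcoef(band[:, :-1].ravel(), band[:, 1:].ravel())[0, 1]

rng = np.random.default_rng(0)
# Toy stand-in for a natural image: smooth ramp, a sharp edge, mild noise.
img = np.add.outer(np.arange(64), np.arange(64)).astype(float)
img[:, 32:] += 50.0
img += rng.normal(scale=2.0, size=img.shape)

_, lh, hl, hh = haar2d(img)
for name, band in [("LH", lh), ("HL", hl), ("HH", hh)]:
    print(f"{name}: raw corr {adjacent_corr(band):+.2f}, "
          f"magnitude corr {adjacent_corr(np.abs(band)):+.2f}")
```

On real photographic images the magnitude correlations are substantially larger than the raw-coefficient correlations; this toy image only demonstrates how the statistic is computed.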

    Image compression via joint statistical characterization in the wavelet domain

    We develop a statistical characterization of natural images in the wavelet transform domain. This characterization describes the joint statistics between pairs of subband coefficients at adjacent spatial locations, orientations, and scales. We observe that the raw coefficients are nearly decorrelated, but their magnitudes are highly correlated. A linear magnitude predictor coupled with both multiplicative and additive uncertainties accounts for the joint coefficient statistics of a wide variety of images including photographic images, graphical images, and medical images. In order to directly demonstrate the power of this model, we construct an image coder called EPWIC (Embedded Predictive Wavelet Image Coder), in which subband coefficients are encoded one bitplane at a time using a non-adaptive arithmetic encoder that utilizes probabilities calculated from the model. Bitplanes are ordered using a greedy algorithm that considers the MSE reduction per encoded bit. The decoder uses the statistical model to predict coefficient values based on the bits it has received. The rate-distortion performance of the coder compares favorably with the current best image coders in the literature.
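The "linear magnitude predictor coupled with multiplicative and additive uncertainties" can be illustrated with a toy least-squares fit of a child coefficient's magnitude from its coarser-scale parent. This is a simplified sketch on simulated magnitudes, not the EPWIC model itself, which conditions on a neighbourhood of magnitudes across location, orientation, and scale:

```python
import numpy as np

def fit_magnitude_predictor(parent_mag, child_mag):
    """Least-squares linear predictor of child |coef| from parent |coef|."""
    X = np.column_stack([parent_mag, np.ones_like(parent_mag)])
    (w, b), *_ = np.linalg.lstsq(X, child_mag, rcond=None)
    residual = child_mag - (w * parent_mag + b)
    return w, b, residual.std()

rng = np.random.default_rng(1)
parent = np.abs(rng.normal(size=1000))
# Simulated child band: magnitude scales with the parent's (multiplicative
# coupling) plus an additive uncertainty term.
child = 0.8 * parent + np.abs(rng.normal(scale=0.1, size=1000))

w, b, sigma = fit_magnitude_predictor(parent, child)
print(f"slope {w:.2f}, intercept {b:.2f}, residual std {sigma:.2f}")
```

In the coder, a conditional distribution built around such a predictor supplies the symbol probabilities that drive the arithmetic encoder.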

    Progressive Wavelet Image Coding Based on a Conditional Probability Model

    We present a wavelet image coder based on an explicit model of the conditional statistical relationships between coefficients in different subbands. In particular, we construct a parameterized model for the conditional probability of a coefficient given coefficients at a coarser scale. Subband coefficients are encoded one bitplane at a time using a non-adaptive arithmetic encoder. The overall ordering of bitplanes is determined by the ratio of their encoded variance to compressed size. We show rate-distortion comparisons of the coder to first and second-order theoretical entropy bounds and the EZW coder [1]. The coder is inherently embedded, and should prove useful in applications requiring progressive transmission. Orthonormal wavelet decompositions have proven to be extremely effective for image compression [2, 3, 4, 5, 1]. We believe there are several statistical reasons for this success. Similar to the Fourier transform, wavelets are quite good at decorrelating the second-order statistics of ..
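Both coders order bitplanes greedily by the distortion reduction each buys per encoded bit. A minimal sketch of that ordering step, with hypothetical bitplane statistics standing in for values a real coder would measure:

```python
def order_bitplanes(planes):
    """Sort (label, mse_reduction, encoded_bits) tuples by distortion
    reduction per encoded bit, most valuable bitplane first."""
    return sorted(planes, key=lambda p: p[1] / p[2], reverse=True)

# Hypothetical statistics: (label, MSE reduction, compressed size in bits).
planes = [
    ("LL-bit7", 400.0, 200),   # coarse bitplane: large gain, cheap to code
    ("HH-bit3", 10.0, 500),    # fine texture bitplane: small gain, costly
    ("LH-bit5", 120.0, 300),
]
for label, dmse, bits in order_bitplanes(planes):
    print(label, round(dmse / bits, 3))
```

Because the most valuable bitplanes are transmitted first, truncating the bitstream at any point yields a usable image, which is what makes the coder embedded and suited to progressive transmission.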