
    Lossy-to-lossless 3D image coding through prior coefficient lookup tables

    This paper describes a low-complexity, high-efficiency, lossy-to-lossless 3D image coding system. The proposed system is based on a novel probability model for the symbols emitted by bitplane coding engines. This probability model uses partially reconstructed coefficients from previous components together with a mathematical framework that captures the statistical behavior of the image. An important aspect of this mathematical framework is its generality, which makes the proposed scheme suitable for different types of 3D images. The main advantages of the proposed scheme are competitive coding performance, low computational load, very low memory requirements, straightforward implementation, and simple adaptation to most sensors.
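    For intuition only, the sketch below shows one way a lookup table conditioned on co-located coefficients from a previously decoded component could be trained to drive a bitplane coder; the magnitude binning, the function name, and the Laplace smoothing are illustrative assumptions, not the paper's exact formulation.

        import numpy as np

        def build_lut(prior, current, bitplane, bins=16):
            # Bin each prior-component coefficient by its magnitude bits
            # above the current bitplane (binning scheme is an assumption).
            idx = np.minimum(np.abs(prior).ravel() >> bitplane, bins - 1)
            # Bit emitted by each co-located coefficient at this bitplane.
            bit = (np.abs(current).ravel() >> bitplane) & 1
            ones = np.bincount(idx, weights=bit, minlength=bins)
            total = np.bincount(idx, minlength=bins)
            return (ones + 1.0) / (total + 2.0)  # smoothed P(bit = 1)

    At coding time, the probability fed to the arithmetic coder would simply be read from the table entry selected by the co-located, partially reconstructed coefficient, which keeps both the computational load and the memory footprint small.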

    Context-adaptive binary arithmetic coding with fixed-length codewords

    Context-adaptive binary arithmetic coding is a widespread technique in the field of image and video coding. Most state-of-the-art arithmetic coders produce a (long) codeword of a priori unknown length, whose generation requires a renormalization procedure to permit progressive processing. This paper introduces two arithmetic coders that produce multiple codewords of fixed length. Contrary to the traditional approach, the generation of fixed-length codewords does not require renormalization, since the whole interval arithmetic is stored in the coder's internal registers. The proposed coders employ a new context-adaptive mechanism based on a variable-size sliding window that estimates the probability of the coded symbols with high precision. Their integration in coding systems is straightforward, as demonstrated within the framework of JPEG2000. Experimental tests indicate that the proposed coders are computationally simpler than the MQ coder of JPEG2000 and the M coder of HEVC while achieving superior coding efficiency.
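    As a hedged illustration of the window-based estimation idea (not the coders' exact mechanism), the following Python generator estimates P(1) from the most recent symbols; the fixed cap on the window size only approximates the variable-size policy described above.

        from collections import deque

        def sliding_window_p1(symbols, max_window=64):
            # The window grows up to max_window and then slides; the real
            # coders vary the window size adaptively, which this fixed
            # cap merely approximates.
            window = deque(maxlen=max_window)
            for s in symbols:
                # Smoothed estimate from the last len(window) symbols,
                # yielded before the update so it can be used to code s.
                yield (sum(window) + 1) / (len(window) + 2)
                window.append(s)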

    Bitplane image coding with parallel coefficient processing

    Image coding systems have traditionally been tailored for multiple instruction, multiple data (MIMD) computing. In general, they partition the (transformed) image into codeblocks that can be coded in the cores of MIMD-based processors. Each core executes a sequential flow of instructions to process the coefficients in its codeblock, independently and asynchronously from the other cores. Bitplane coding is a common strategy to code such data, but most of its mechanisms require sequential processing of the coefficients. Recent years have seen the rise of processing accelerators with enhanced computational performance and power efficiency whose architecture is mainly based on the single instruction, multiple data (SIMD) principle. SIMD computing refers to the execution of the same instruction on multiple data in a lockstep, synchronous way. Unfortunately, current bitplane coding strategies cannot fully profit from such processors due to their inherently sequential coding tasks. This paper presents bitplane image coding with parallel coefficient processing (BPC-PaCo), a coding method that can process many coefficients within a codeblock in parallel and synchronously. To this end, the scanning order, the context formation, the probability model, and the arithmetic coder of the coding engine have been reformulated. The experimental results suggest that the penalty in coding performance of BPC-PaCo with respect to traditional strategies is almost negligible.
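    To make the lockstep idea concrete, here is a minimal vectorized sketch (an assumption-laden stand-in for BPC-PaCo's actual engine, with hypothetical names) in which every coefficient's bit at the current bitplane is extracted in a single SIMD-like operation rather than through a sequential scan.

        import numpy as np

        def bitplane_pass(coeffs, bitplane):
            # One vectorized operation processes all coefficients of the
            # codeblock in lockstep, instead of one after another.
            mags = np.abs(coeffs)
            bits = (mags >> bitplane) & 1
            # A coefficient becomes significant when its first 1-bit
            # appears, i.e., no higher magnitude bit was set.
            newly_significant = (bits == 1) & ((mags >> (bitplane + 1)) == 0)
            return bits, newly_significant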

    Stationary probability model for microscopic parallelism in JPEG2000

    Parallel processing is key to increasing the throughput of image codecs. Despite numerous efforts to parallelize wavelet-based image coding systems, most attempts fail at the parallelization of the bitplane coding engine, which is the most computationally intensive stage of the coding pipeline. The main reason for this failure is the causality with which current coding strategies are devised, which assumes that one coefficient is coded after another. This work analyzes the mechanisms employed in bitplane coding and proposes alternatives to enhance opportunities for parallelism. We describe a stationary probability model that, without sacrificing the advantages of current approaches, removes the main obstacle to the parallelization of most coding strategies. Experimental tests evaluate the coding performance achieved by the proposed method in the framework of JPEG2000 when coding different types of images. Results indicate that the stationary probability model achieves similar coding performance, with slight increments or decrements depending on the image type and the desired level of parallelism.
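    The parallelism argument can be sketched in a few lines: with a stationary model, the probability assigned to a symbol depends only on its context label, not on which symbols were coded before it, so probability lookup is order-free. The toy function below (names and table are hypothetical) illustrates this property.

        def stationary_probabilities(contexts, table):
            # table maps each context label to a fixed, pre-trained P(1),
            # e.g. table = {0: 0.95, 1: 0.7, 2: 0.5}. No per-symbol state
            # is updated, so the lookups can be performed concurrently
            # and in any order.
            return [table[c] for c in contexts]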

    Dual link image coding for earth observation satellites

    The conventional strategy to download images captured by satellites is to compress the data on board and then transmit them via the downlink. It often happens that the capacity of the downlink is too small to accommodate all the acquired data, so the images are trimmed and/or transmitted through lossy regimes. This paper introduces a coding system that increases the amount and quality of the downloaded imaging data. The main insight of this paper is to use both the uplink and the downlink to code the images. The uplink is employed to send reference information to the satellite so that the onboard coding system can achieve higher efficiency. This reference information is computed on the ground, possibly employing extensive data and computational resources. The proposed system, called dual link image coding, is devised here for Earth observation satellites with polar orbits. Experimental results obtained for data sets acquired by the Landsat 8 satellite indicate significant coding gains with respect to conventional methods.
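    As a rough sketch of the dual-link idea (the actual system's prediction and entropy coding are far more elaborate than a plain difference), the satellite could code only the residual against an uplinked, ground-computed reference:

        import numpy as np

        def onboard_encode(image, reference):
            # The uplinked reference is subtracted on board; the residual
            # is cheaper to code than the raw image when the reference is
            # a good predictor. Entropy coding is omitted here.
            return image.astype(np.int32) - reference.astype(np.int32)

        def ground_decode(residual, reference):
            # The ground segment already holds the reference, so the
            # image is recovered exactly from the residual.
            return residual + reference.astype(np.int32)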

    Entropy-based evaluation of context models for wavelet-transformed images

    Entropy is a measure of a message's uncertainty. Among other aspects, it serves to determine the minimum coding rate that practical systems may attain. This paper defines an entropy-based measure to evaluate context models employed in wavelet-based image coding. The proposed measure is defined considering the mechanisms utilized by modern coding systems, and it establishes the maximum performance achievable with each context model. This helps to determine the adequacy of the model under different coding conditions and serves to predict with high precision the coding rate achieved by practical systems. Experimental results evaluate four well-known context models using different types of images, coding rates, and transform strategies. They reveal that, under specific coding conditions, some widespread context models may not be as adequate as is generally thought. The hints provided by this analysis may help to design simpler and more efficient wavelet-based image codecs.
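    A minimal sketch of such an entropy-based bound, under the simplifying assumption that the measure reduces to the empirical conditional entropy H(bit | context) (the paper's definition accounts for further coding mechanisms):

        import numpy as np

        def conditional_entropy(bits, contexts, n_contexts):
            # H(bit | context) in bits per symbol: a lower bound on the
            # rate any coder driven by this context model can reach on
            # these data, estimated from empirical counts.
            n = len(bits)
            h = 0.0
            for c in range(n_contexts):
                sel = bits[contexts == c]
                if sel.size == 0:
                    continue
                p = sel.mean()
                if 0.0 < p < 1.0:
                    h -= (sel.size / n) * (p * np.log2(p)
                                           + (1 - p) * np.log2(1 - p))
            return h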
