    Development and evaluation of packet video schemes

    Results reflecting the two tasks proposed for the current year, namely a feasibility study of simulating the NASA network and a study of progressive transmission schemes, are presented. The view of the NASA network, gleaned from the various technical reports made available to us, is provided, along with a brief overview of how the current simulator could be modified to accomplish the goal of simulating the NASA network. As the material in this section would form the basis for the actual simulation, it is important to ensure that it accurately reflects the requirements on the simulator. Brief descriptions of the set of progressive transmission algorithms selected for the study are included. The results available in the literature were obtained under a variety of different assumptions, not all of which are stated; as such, the only way to compare the efficiency and implementational complexity of the various algorithms is to simulate them.

    Progressive transmission of pseudo-color images. Appendix 1: Item 4

    The transmission of digital images can require considerable channel bandwidth. The cost of obtaining such a channel can be prohibitive, or the channel might simply not be available. In this case, progressive transmission (PT) can be useful. PT presents the user with a coarse initial image approximation and then proceeds to refine it. In this way, the user tends to receive information about the content of the image sooner than if a sequential transmission method is used. PT finds application in image database browsing, teleconferencing, medical imaging, and other areas. A PT scheme is developed for use with a particular type of image data, the pseudo-color or color-mapped image. Such images consist of a table of colors called a colormap, plus a 2-D array of index values indicating which colormap entry is to be used to display a given pixel. This type of image presents some unique problems for a PT coder, and techniques for overcoming these problems are developed. A computer simulation of the color-mapped PT scheme is developed to evaluate its performance. Results of simulations using several test images are presented.
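
    As a rough illustration of the data structure described above (not the paper's actual coder), the sketch below builds a color-mapped image as a small colormap plus a 2-D index array, and produces a crude coarse-to-fine sequence by subsampling the index plane; the subsampling factors and NumPy usage are assumptions for illustration only.

        # Color-mapped image: a colormap table plus a 2-D array of indices into it.
        # The "progressive" passes (every 4th index, every 2nd, then all) only
        # illustrate coarse-to-fine refinement, not the paper's scheme.
        import numpy as np

        colormap = np.array([[255, 0, 0],      # entry 0: red
                             [0, 255, 0],      # entry 1: green
                             [0, 0, 255]],     # entry 2: blue
                            dtype=np.uint8)

        indices = np.random.randint(0, len(colormap), size=(8, 8), dtype=np.uint8)

        def to_rgb(index_plane, cmap):
            """Expand an index plane to an RGB image via colormap lookup."""
            return cmap[index_plane]

        def progressive_passes(index_plane, steps=(4, 2, 1)):
            """Yield successively finer approximations by subsampling the index
            plane and repeating each retained index over a step x step block."""
            for s in steps:
                coarse = index_plane[::s, ::s]
                yield np.kron(coarse, np.ones((s, s), dtype=np.uint8))

        for approx in progressive_passes(indices):
            rgb = to_rgb(approx, colormap)   # what the user would see at this stage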

    Context-Based Scalable Coding and Representation of High Resolution Art Pictures for Remote Data Access

    EROS is the largest database of high-resolution art pictures in the world. The TSAR project is designed to open it in a secure, efficient and user-friendly way, involving cryptography and watermarking as well as compression and region-level representation abilities. This paper addresses the last two points in particular. The LAR codec is first presented as a suitable solution for picture encoding, with compression ranging from highly lossy to lossless. Then, we detail the concept of self-extracting region representation, which consists of performing a segmentation process at both the coder and the decoder from a highly compressed image, and later locally enhancing the image in a region of interest. The overall scheme provides an efficient, consistent solution for advanced data browsing.
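
    A minimal sketch of the self-extracting idea, under the assumption of a toy intensity-quantization segmenter (the actual TSAR/LAR segmentation is far more elaborate): because coder and decoder run the same deterministic segmentation on the same coarsely decoded image, no region map needs to be transmitted.

        # Toy illustration of "self-extracting" regions: both sides segment the
        # same coarse image with the same deterministic rule, so their label maps
        # match bit for bit and only refinement data for a chosen region is sent.
        import numpy as np

        def segment(coarse_image, threshold=16):
            """Toy deterministic segmentation: label pixels by quantized intensity."""
            return (coarse_image // threshold).astype(np.int32)

        coarse = np.clip(np.random.randn(64, 64) * 20 + 128, 0, 255).astype(np.uint8)

        labels_at_coder = segment(coarse)      # computed at the encoder
        labels_at_decoder = segment(coarse)    # recomputed at the decoder
        assert np.array_equal(labels_at_coder, labels_at_decoder)

        # A region of interest can then be referenced by its label alone, and only
        # the local enhancement data for that region needs to be transmitted.
        roi_mask = labels_at_decoder == labels_at_decoder[32, 32]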

    Wavelet Based Image Coding Schemes: A Recent Survey

    A variety of new and powerful algorithms have been developed for image compression over the years. Among them, wavelet-based image compression schemes have gained much popularity due to their overlapping nature, which reduces the blocking artifacts that are common in JPEG compression, and their multiresolution character, which leads to superior energy compaction with high-quality reconstructed images. This paper provides a detailed survey of some of the popular wavelet coding techniques, such as Embedded Zerotree Wavelet (EZW) coding, Set Partitioning in Hierarchical Trees (SPIHT) coding, the Set Partitioned Embedded Block (SPECK) coder, and the Embedded Block Coding with Optimized Truncation (EBCOT) algorithm. Other wavelet-based coding techniques, such as the Wavelet Difference Reduction (WDR) and Adaptive Scanned Wavelet Difference Reduction (ASWDR) algorithms, the Space Frequency Quantization (SFQ) algorithm, the Embedded Predictive Wavelet Image Coder (EPWIC), Compression with Reversible Embedded Wavelet (CREW), Stack-Run (SR) coding and the recent Geometric Wavelet (GW) coding, are also discussed. Based on the review, recommendations and discussions are presented for algorithm development and implementation.
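
    The embedded, bit-plane-oriented coding that EZW, SPIHT and SPECK share can be sketched as follows; the tree/set partitioning that distinguishes the individual algorithms is deliberately omitted, and the toy coefficient values are assumptions for illustration.

        # Bit-plane (embedded) coding sketch: coefficients are tested against
        # decreasing thresholds 2^n, and the bitstream can be truncated after any
        # pass. The coding of the significance map itself (zerotrees / set
        # partitioning) is the part the surveyed algorithms differ on and is omitted.
        import numpy as np

        coeffs = np.array([63, -34, 49, 10, 7, 13, -12, 7])   # toy wavelet coefficients
        n = int(np.floor(np.log2(np.max(np.abs(coeffs)))))    # top bit-plane, here 5

        significant = np.zeros(len(coeffs), dtype=bool)
        bitstream = []                     # bits an embedded coder would emit, in order
        for bitplane in range(n, -1, -1):
            threshold = 1 << bitplane
            previously = significant.copy()
            # Significance pass: flag coefficients that become significant here.
            newly = (~significant) & (np.abs(coeffs) >= threshold)
            significant |= newly
            bitstream += [int(c > 0) for c in coeffs[newly]]              # sign bits
            # Refinement pass: one more magnitude bit for older significant coeffs.
            bitstream += list((np.abs(coeffs[previously]) >> bitplane) & 1)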

    Virtually Lossless Compression of Astrophysical Images

    We describe an image compression strategy potentially capable of preserving the scientific quality of astrophysical data while simultaneously allowing a substantial bandwidth reduction to be achieved. Unlike strictly lossless techniques, by which only moderate compression ratios are attainable, and conventional lossy techniques, in which the mean square error of the decoded data is globally controlled by users, near-lossless methods are capable of locally constraining the maximum absolute error, based on users' requirements. An advanced lossless/near-lossless differential pulse code modulation (DPCM) scheme, recently introduced by the authors and relying on causal spatial prediction, is adjusted to the specific characteristics of astrophysical image data (high radiometric resolution, generally low noise, etc.). The background noise is preliminarily estimated to drive the quantization stage for high quality, which is the primary concern in most astrophysical applications. Extensive experimental results of lossless, near-lossless, and lossy compression of astrophysical images acquired by the Hubble Space Telescope show the advantages of the proposed method compared to standard techniques such as JPEG-LS and JPEG 2000. Finally, the rationale of virtually lossless compression, that is, noise-adjusted lossless/near-lossless compression, is highlighted and found to be in accordance with concepts well established in the astronomical community.
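
    The near-lossless DPCM principle invoked above can be sketched as follows; the raster-order previous-pixel predictor and the specific delta value are simplifying assumptions rather than the authors' codec, but the residual quantizer with step 2*delta+1 shows how the maximum absolute reconstruction error is bounded by delta.

        # Near-lossless DPCM sketch: quantize the prediction residual with step
        # 2*delta+1 so that every reconstructed pixel differs from the original by
        # at most delta. Tying delta to the estimated background noise gives the
        # "virtually lossless" regime described above.
        import numpy as np

        def near_lossless_dpcm(image, delta):
            rows, cols = image.shape
            recon = np.zeros_like(image, dtype=np.int32)
            symbols = np.zeros_like(image, dtype=np.int32)   # values to entropy-code
            img = image.astype(np.int32)
            for r in range(rows):
                for c in range(cols):
                    # Simplified causal predictor: previous pixel in raster order.
                    pred = recon[r, c - 1] if c > 0 else (recon[r - 1, c] if r > 0 else 0)
                    e = img[r, c] - pred                     # prediction residual
                    q = int(np.round(e / (2 * delta + 1)))   # near-lossless quantization
                    symbols[r, c] = q
                    recon[r, c] = pred + q * (2 * delta + 1) # decoder-side reconstruction
            return symbols, recon

        image = np.random.randint(0, 4096, size=(32, 32))    # high radiometric resolution
        delta = 2                                            # e.g. tied to noise estimate
        _, recon = near_lossless_dpcm(image, delta)
        assert np.max(np.abs(recon - image)) <= delta        # bounded per-pixel error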

    Contemporary Affirmation of SPIHT Improvements in Image Coding

    Set Partitioning in Hierarchical Trees (SPIHT) is a widely used compression algorithm for wavelet-transformed images. Since its introduction in 1996, the SPIHT algorithm has attracted a great deal of interest for image compression. SPIHT is considerably simpler and more efficient than several existing compression methods: it is a fully embedded codec, provides good image quality and high PSNR, is well suited to progressive image transmission, combines efficiently with error protection, and can supply information on demand. Nevertheless, it has some drawbacks that need to be removed for it to be used more effectively, and since its development it has therefore undergone many modifications to its original form. This document presents a survey of several improvements to SPIHT in areas such as speed, redundancy, quality, error resilience, complexity, compression ratio and memory requirements.

    A software system for laboratory experiments in image processing

    Laboratory experiments for image processing courses are usually software implementations of processing algorithms, but students of image processing come from diverse backgrounds with widely differing software experience. To avoid learning overhead, the software system should be easy to learn and use, even for those with no exposure to mathematical programming languages or object-oriented programming. The class library for image processing (CLIP) supports users with knowledge of C by providing three C++ types with small public interfaces, including natural and efficient operator overloading. CLIP programs are compact and fast. Experience in using the system in undergraduate and graduate teaching indicates that it supports subject-matter learning with little distraction from language/system learning.
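
    Purely as a hypothetical illustration (in Python rather than CLIP's actual C++ interface), the sketch below shows the kind of small, operator-overloaded image type the abstract describes, where processing code reads much like the underlying arithmetic; the class and method names are invented for this example and are not CLIP's API.

        # Hypothetical image type with a small public interface and natural
        # operator overloading, so student code resembles the formulas being taught.
        import numpy as np

        class Image:
            def __init__(self, data):
                self.data = np.asarray(data, dtype=np.float64)

            def __add__(self, other):          # image + image, or image + scalar
                other = other.data if isinstance(other, Image) else other
                return Image(self.data + other)

            def __mul__(self, scalar):         # brightness scaling
                return Image(self.data * scalar)

        # A two-frame average, written the way the formula is written.
        a, b = Image(np.ones((4, 4))), Image(np.full((4, 4), 3.0))
        avg = (a + b) * 0.5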

    Locally Adaptive Resolution (LAR) codec

    The JPEG committee has initiated a study of potential technologies dedicated to future-generation image compression systems. The idea is to design a new image compression standard, named JPEG AIC (Advanced Image Coding), together with advanced evaluation methodologies closely matching human visual system characteristics. JPEG AIC thus aims at defining a complete coding system able to address advanced functionalities such as lossy-to-lossless compression, scalability (spatial, temporal, depth, quality, complexity, component, granularity...), robustness, embeddability, and content description for image handling at the object level. The chosen compression method would have to fit perceptual metrics defined by the JPEG community within the JPEG AIC project. In this context, we propose the Locally Adaptive Resolution (LAR) codec as a contribution to the related call for technologies, aiming to fit all of the previous functionalities. This method is a coding solution that simultaneously provides a relevant representation of the image. This property is exploited through various complementary coding schemes in order to design a highly scalable encoder. The LAR method was initially introduced for lossy image coding. This efficient image compression solution relies on a content-based system driven by a specific quadtree representation, based on the assumption that an image can be represented as layers of basic information and local texture. Multiresolution versions of this codec have shown their efficiency, from low bit rates up to lossless compressed images. An original hierarchical self-extracting region representation has also been elaborated: a segmentation process is realized at both coder and decoder, leading to a segmentation map that comes for free (no map needs to be transmitted). The latter can be further exploited for color region encoding and image handling at the region level. Moreover, the inherent structure of the LAR codec can be used for advanced functionalities such as content security. In particular, dedicated Unequal Error Protection systems have been produced and tested for transmission over the Internet or wireless channels. Hierarchical selective encryption techniques have been adapted to our coding scheme, and a data hiding system based on the LAR multiresolution description allows efficient content protection. Thanks to the modularity of our coding scheme, complexity can be adjusted to address various embedded systems: for example, a basic version of the LAR coder has been implemented on an FPGA platform while respecting real-time constraints, and the pyramidal LAR solution and hierarchical segmentation process have also been prototyped on heterogeneous DSP architectures. This chapter first introduces the JPEG AIC scope and details the associated requirements. We then develop the technical features of the LAR system and show the originality of the proposed scheme, both in terms of functionalities and services. In particular, we show that the LAR coder remains efficient for natural images, medical images, and art images.
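
    The content-driven quadtree idea underlying the LAR representation can be sketched as follows; the max-minus-min homogeneity test, block sizes and threshold are illustrative assumptions and not the LAR codec's actual criterion or coding layers.

        # Content-driven quadtree sketch: blocks whose local activity exceeds a
        # threshold are split further, so flat areas get large blocks (the "basic
        # information" layer) and textured areas get small ones (the texture layer).
        import numpy as np

        def quadtree(image, x, y, size, min_size=2, threshold=20, leaves=None):
            if leaves is None:
                leaves = []
            block = image[y:y + size, x:x + size]
            activity = int(block.max()) - int(block.min())    # simple homogeneity test
            if size <= min_size or activity <= threshold:
                leaves.append((x, y, size))                   # homogeneous: keep one leaf
            else:
                half = size // 2
                for dy in (0, half):
                    for dx in (0, half):
                        quadtree(image, x + dx, y + dy, half, min_size, threshold, leaves)
            return leaves

        img = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
        leaves = quadtree(img, 0, 0, 64)   # list of (x, y, size) blocks forming the grid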

    Map online system using internet-based image catalogue

    Digital maps carry geodata, such as coordinates, that is important for topographic and thematic maps; this geodata is especially meaningful in the military field. Because the maps carry this information, the image files are large: larger files require more storage and can also cause longer loading times. These conditions make such images unsuitable for an image catalogue approach in an Internet environment. With compression techniques, the image size can be reduced while the image quality is preserved without much change. This report focuses on one image compression technique based on wavelet technology, which performs better than many other image compression techniques available today. The compressed images are applied to a system called Map Online, which uses an Internet-based image catalogue approach. The system allows users to buy maps online and to download the maps they have bought, in addition to searching for maps based on several meaningful keywords. This system is expected to be used by Jabatan Ukur dan Pemetaan Malaysia (JUPEM) to help realize the organization's vision.