
    Simulated Annealing for JPEG Quantization

    JPEG is one of the most widely used image formats, but in some ways it remains surprisingly unoptimized, perhaps because some natural optimizations would fall outside the standard that defines JPEG. We show how to improve JPEG compression in a standard-compliant, backward-compatible manner by finding improved default quantization tables. We describe a simulated annealing technique that has allowed us to find several quantization tables that perform better than the industry standard, in terms of both compressed size and image fidelity. Specifically, we derive tables that reduce the FSIM error by over 10% while improving compression by over 20% at quality level 95 in our tests; we also provide similar results for other quality levels. While we acknowledge that our approach can, for some images, lead to visible artifacts under large magnification, we believe that use of these quantization tables, or of additional tables found using our methodology, would significantly reduce JPEG file sizes while improving overall image quality.
    Comment: Appendix not included in arXiv version due to size restrictions. For the full paper go to: http://www.eecs.harvard.edu/~michaelm/SimAnneal/PAPER/simulated-annealing-jpeg.pd
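    The search itself is standard simulated annealing over the 64 entries of an 8x8 quantization table. The sketch below is a minimal, self-contained illustration of that loop, not the paper's implementation: the image blocks are synthetic, and the cost is a proxy (nonzero-coefficient count as a rate term, MSE as a distortion term) standing in for the paper's size/FSIM objective; the temperature schedule and step sizes are arbitrary.

```python
# Hedged sketch: simulated annealing over a JPEG-style 8x8 quantization table.
import numpy as np

rng = np.random.default_rng(0)

def dct_matrix(n=8):
    # Orthonormal DCT-II basis matrix, so the inverse transform is C.T @ X @ C.
    k = np.arange(n)
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C[0] /= np.sqrt(2)
    return C * np.sqrt(2 / n)

C = dct_matrix()

def cost(q, blocks, lam=0.05):
    # Proxy objective: distortion (MSE after quantize/dequantize round trip)
    # plus a rate term (average nonzero quantized coefficients per block).
    coeffs = C @ blocks @ C.T            # forward 2-D DCT of every block
    quant = np.round(coeffs / q)
    recon = C.T @ (quant * q) @ C        # inverse DCT after dequantization
    rate = np.count_nonzero(quant) / blocks.shape[0]
    mse = np.mean((recon - blocks) ** 2)
    return mse + lam * rate

# Synthetic smooth-ish 8x8 blocks standing in for a training image set.
blocks = np.cumsum(rng.normal(size=(256, 8, 8)), axis=2)

cur = np.full((8, 8), 16.0)              # starting quantization table
cur_cost = cost(cur, blocks)
best, best_cost = cur.copy(), cur_cost
T = 1.0
for step in range(2000):
    cand = cur.copy()
    i, j = rng.integers(0, 8, size=2)    # perturb one random table entry
    cand[i, j] = np.clip(cand[i, j] + rng.integers(-4, 5), 1, 255)
    c = cost(cand, blocks)
    # Metropolis acceptance: always take improvements, occasionally take
    # uphill moves, with probability shrinking as temperature T cools.
    if c < cur_cost or rng.random() < np.exp((cur_cost - c) / T):
        cur, cur_cost = cand, c
        if c < best_cost:
            best, best_cost = cand.copy(), c
    T *= 0.999

print("best proxy cost:", best_cost)
```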

    A spectral multi-resolution image encoding network

    After a short introduction to traditional image transform coding, multirate systems, and multiscale signal coding, the paper focuses on image encoding by a neural network. Taking noise into account as well, a network model is proposed which not only learns the optimal localized basis functions for the transform but also learns to implement a whitening filter through multi-resolution encoding. A simulation demonstrating the multi-resolution capabilities concludes the contribution.
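    The abstract gives no architectural details, but the flavor of the idea, a network that learns transform basis functions from data while input noise pushes the code toward decorrelation, can be sketched with a toy tied-weight linear autoencoder. Everything below (patch model, noise level, learning rate, layer sizes) is an assumption of the sketch, not the paper's network.

```python
# Toy illustration: a tied-weight linear autoencoder trained on noisy
# correlated patches; minimizing reconstruction error under input noise
# encourages decorrelated (whitened) codes.
import numpy as np

rng = np.random.default_rng(1)
P, D, H = 2000, 64, 16                    # patches, patch dim (8x8), code units

# Synthetic correlated "image" patches: white noise smoothed by a kernel.
patches = rng.normal(size=(P, D))
kernel = np.exp(-0.3 * np.abs(np.arange(D)[:, None] - np.arange(D)[None, :]))
patches = patches @ kernel
patches -= patches.mean(axis=0)

W = rng.normal(scale=0.1, size=(H, D))    # analysis/synthesis weights (tied)
lr, sigma = 5e-4, 0.5
for epoch in range(300):
    noisy = patches + sigma * rng.normal(size=patches.shape)
    code = noisy @ W.T                     # encoding transform
    recon = code @ W                       # decoding with the same weights
    err = patches - recon
    # Gradient of ||x - (x_noisy W^T) W||^2 with respect to the tied W.
    grad = -(code.T @ err + (err @ W.T).T @ noisy) / P
    W -= lr * grad

# Diagnostic: smaller off-diagonal covariance energy of the clean-patch
# codes indicates more decorrelated (whitened) outputs.
cov = np.cov((patches @ W.T).T)
print("off-diagonal energy:", np.sum(cov**2) - np.sum(np.diag(cov)**2))
```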

    Decompression of JPEG Document Images: A Survey Paper

    JPEG decompression techniques are very useful for 3G/4G-based markets, handheld devices, and infrastructure. Previously proposed decompression methods face challenging issues, such as very high computational cost and heavy distortion from ringing and blocking artifacts, which can render the image unreadable. To improve the visual quality of JPEG document images at low bit rates and low computational cost, we implement a decompression technique for JPEG document images. We first divide the JPEG document image into smooth and non-smooth blocks with the help of the Discrete Cosine Transform (DCT). The smooth blocks (background and uniform regions) are then decoded in the transform domain by minimizing the Total Block Boundary Variation (TBBV). We propose to compute the block variation directly in the DCT domain at the super-pixel level; each super-pixel has size n*n and is assigned an average intensity value. The smooth blocks are then reconstructed using Newton's method; this constitutes the smooth-block decompression stage. The non-smooth blocks of the document image contain the text and graphics/line-drawing objects. For these, a post-processing algorithm is introduced that takes the specificities of document content into consideration, and the inverse DCT is applied to represent the image in the spatial domain; this constitutes the non-smooth-block decompression stage. Finally, we present experimental results and show that our system improves the quality of decompressed JPEG document images compared with existing methods.
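    As a rough illustration of the first stage named above, the sketch below classifies 8x8 blocks as smooth or non-smooth directly from their DCT coefficients. The AC-energy threshold is an illustrative stand-in for the paper's criterion, and the TBBV minimization and Newton reconstruction that follow are not shown.

```python
# Hedged sketch: smooth vs. non-smooth block classification in the DCT domain.
import numpy as np

def dct_matrix(n=8):
    # Orthonormal DCT-II basis matrix.
    k = np.arange(n)
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C[0] /= np.sqrt(2)
    return C * np.sqrt(2 / n)

def classify_blocks(image, threshold=100.0):
    """Split an image (H, W), with H and W multiples of 8, into 8x8 blocks
    and flag each as smooth by thresholding its AC energy in the DCT domain."""
    C = dct_matrix()
    h, w = image.shape
    blocks = (image.reshape(h // 8, 8, w // 8, 8)
                   .swapaxes(1, 2)
                   .reshape(-1, 8, 8))
    coeffs = C @ blocks @ C.T            # 2-D DCT of every block
    ac = coeffs.copy()
    ac[:, 0, 0] = 0.0                    # discard the DC term
    energy = np.sum(ac**2, axis=(1, 2))
    smooth = energy < threshold          # background / uniform regions
    return blocks, smooth

# Toy document image: flat background plus one high-contrast "text" block.
img = np.full((32, 32), 128.0)
img[8:16, 8:16] = np.where(np.indices((8, 8)).sum(0) % 2, 0, 255)
blocks, smooth = classify_blocks(img)
print("smooth blocks:", int(smooth.sum()), "of", len(smooth))
```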

    Strong edge features for image coding

    A two-component model is proposed for perceptual image coding. For the first component of the model, the watershed operator is used to detect strong edge features. An efficient morphological interpolation algorithm then reconstructs the smooth areas of the image from the extracted edge information, also known as sketch data. The residual component, containing fine textures, is coded separately by a subband coding scheme. The morphological operators involved in coding the primary component perform very efficiently compared with conventional techniques such as the LoG operator used for edge extraction, or the diffusion filters applied iteratively for the interpolation of smooth areas in previously reported sketch-based coding schemes.
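    The first component can be approximated with off-the-shelf tools: a watershed on the gradient image, with region boundaries taken as the strong-edge sketch data. In the sketch below, scikit-image's generic watershed stands in for the paper's specific morphological operators, and the test image and marker thresholds are illustrative; the morphological interpolation and subband residual coding stages are not shown.

```python
# Hedged sketch: strong-edge extraction via a marker-driven watershed.
import numpy as np
from skimage.filters import sobel
from skimage.segmentation import watershed, find_boundaries

# Toy image: a bright disk on a dark background, with mild noise.
yy, xx = np.mgrid[:128, :128]
image = (((yy - 64)**2 + (xx - 64)**2) < 30**2).astype(float)
image += 0.05 * np.random.default_rng(2).normal(size=image.shape)

gradient = sobel(image)                  # edge-strength map
markers = np.zeros(image.shape, dtype=int)
markers[image < 0.2] = 1                 # background seed
markers[image > 0.8] = 2                 # object seed
labels = watershed(gradient, markers)    # flood the gradient from the seeds
sketch = find_boundaries(labels)         # region boundaries = "sketch data"
print("edge pixels:", int(sketch.sum()))
```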

    Video special effects editing in MPEG-2 compressed video


    Compression through decomposition into browse and residual images

    Economical archival and retrieval of image data is becoming increasingly important given the unprecedented data volumes expected from the Earth Observing System (EOS) instruments. For cost-effective browsing of the image data (possibly from a remote site) and retrieval of the original image data from the archive, we suggest an integrated image browse and data archive system employing incremental transmission. We produce our browse image data with the JPEG/DCT lossy compression approach. Image residual data is then obtained by taking the pixel-by-pixel differences between the original data and the browse image data. We then code the residual data with a form of variable-length coding called diagonal coding. In our experiments, JPEG/DCT is used at different quality factors (Q) to generate the browse and residual data. The algorithm has been tested on band 4 of two Thematic Mapper (TM) data sets. The best overall compression ratios (about 1.7) were obtained when a quality factor of Q=50 was used to produce browse data at a compression ratio of 10 to 11. At this quality factor the browse image data has virtually no visible distortion for the images tested.
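    The decomposition itself is straightforward to demonstrate: JPEG-code the original to get the browse image, and store the signed pixel-wise difference as the residual, so that browse plus residual restores the original exactly. In the sketch below, Pillow's JPEG codec at quality 50 stands in for the paper's JPEG/DCT coder, the input is synthetic data rather than Thematic Mapper band 4, and the diagonal variable-length coding of the residual is not shown.

```python
# Hedged sketch: browse/residual decomposition with exact reconstruction.
import io
import numpy as np
from PIL import Image

rng = np.random.default_rng(3)
# Synthetic 8-bit band standing in for remote-sensing image data.
original = np.clip(
    np.cumsum(rng.normal(0, 4, size=(64, 64)), axis=1) + 128, 0, 255
).astype(np.uint8)

# Browse image: lossy JPEG round trip at quality factor Q=50.
buf = io.BytesIO()
Image.fromarray(original).save(buf, format="JPEG", quality=50)
buf.seek(0)
browse = np.asarray(Image.open(buf), dtype=np.int16)

# Residual: signed pixel-by-pixel difference between original and browse.
residual = original.astype(np.int16) - browse

# Incremental retrieval: browse + residual reproduces the original exactly.
restored = (browse + residual).astype(np.uint8)
assert np.array_equal(restored, original)
print("browse bytes:", buf.getbuffer().nbytes,
      "residual range:", int(residual.min()), int(residual.max()))
```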