
    Compound document compression with model-based biased reconstruction

    The usefulness of electronic document delivery and archives rests in large part on advances in compression technology. Documents can contain complex layouts with different data types, such as text and images, having different statistical characteristics. To achieve better image quality, it is important to make use of such characteristics in compression. We exploit the transform coefficient distributions for text and images. We show that the reconstruction scheme in baseline JPEG does not lead to minimum mean-square error when models of these coefficients are available. Instead, we discuss an algorithm designed for this goal, which first classifies the blocks and then estimates the model parameters to enable a biased reconstruction of the coefficients during decompression. Simulation results are shown to validate the advantages of this method. © 2004 SPIE and IS&T.
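    As a minimal sketch of the biased-reconstruction idea (not the authors' exact algorithm): baseline JPEG reconstructs each quantized DCT coefficient at the centre of its quantization bin, whereas the minimum mean-square-error estimate is the conditional mean of the coefficient model restricted to that bin. Assuming a Laplacian coefficient model with scale b (the function and parameter names are illustrative):

        import numpy as np

        def biased_reconstruction(level, q_step, b):
            """MMSE reconstruction of one quantized DCT coefficient.

            level  : integer quantizer output, round(coef / q_step)
            q_step : quantization step from the JPEG table
            b      : Laplacian scale estimated for this coefficient's block class
            """
            if level == 0:
                return 0.0                        # the zero bin is symmetric, so its mean is 0
            lo = (abs(level) - 0.5) * q_step      # quantization bin covered by this level
            hi = (abs(level) + 0.5) * q_step
            x = np.linspace(lo, hi, 1001)
            w = np.exp(-(x - lo) / b)             # Laplacian density over the bin (constant factor cancels)
            mean = float(np.sum(x * w) / np.sum(w))
            return np.sign(level) * mean          # pulled toward zero, away from the bin centre

        # With a prior concentrated near zero the estimate sits below the bin centre of 16:
        print(biased_reconstruction(level=1, q_step=16, b=6.0))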

    JPEG compression of monochrome 2D-barcode images using DCT coefficient distributions

    Two dimensional (2D) barcodes are becoming a pervasive interface for mobile devices, such as camera phones. Often, only monochrome 2D-barcodes are used due to their robustness in the uncontrolled operating environment of camera phones. Most camera phones capture and store such 2D-barcode images in the baseline JPEG format. As a lossy compression technique, JPEG does introduce a fair amount of error in the decoding of captured 2D-barcode images. In this paper, we introduce an improved JPEG compression scheme for such barcode images. By altering the JPEG compression parameters based on the DCT coefficient distribution of such barcode images, the improved compression scheme produces JPEG images with higher PSNR values than the baseline implementation. We have also applied our improved scheme to a real 2D-barcode system, the QR Code, and analyzed its performance against the baseline JPEG scheme.
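    A hedged sketch of the parameter-tuning idea described above: measure the per-frequency spread of 8x8 block DCT coefficients in a barcode image and shrink the quantization steps where the barcode concentrates its energy. The scaling rule below is illustrative, not the scheme proposed in the paper.

        import numpy as np
        from scipy.fft import dctn

        def dct_stddev_map(img):
            """Per-frequency standard deviation of 8x8 block DCT coefficients (img: 2-D array)."""
            h = img.shape[0] - img.shape[0] % 8
            w = img.shape[1] - img.shape[1] % 8
            blocks = (img[:h, :w].astype(float)
                      .reshape(h // 8, 8, w // 8, 8).transpose(0, 2, 1, 3).reshape(-1, 8, 8))
            coefs = dctn(blocks, axes=(-2, -1), norm="ortho")
            return coefs.std(axis=0)                            # 8x8 map of coefficient spread

        def tuned_quant_table(img, base_table, strength=1.0):
            """Shrink quantization steps for frequencies the barcode actually uses."""
            sd = dct_stddev_map(img)
            weight = 1.0 + strength * sd / (sd.max() + 1e-9)    # in [1, 1 + strength]
            return np.clip(np.round(base_table / weight), 1, 255).astype(np.uint8)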

    A document image model and estimation algorithm for optimized JPEG decompression

    The JPEG standard is one of the most prevalent image compression schemes in use today. While JPEG was designed for use with natural images, it is also widely used for the encoding of raster documents. Unfortunately, JPEG's characteristic blocking and ringing artifacts can severely degrade the quality of text and graphics in complex documents. We propose a JPEG decompression algorithm which is designed to produce substantially higher quality images from the same standard JPEG encodings. The method works by incorporating a document image model into the decoding process which accounts for the wide variety of content in modern complex color documents. It first segments the JPEG encoded document into regions corresponding to background, text, and picture content. The regions corresponding to text and background are then decoded using maximum a posteriori (MAP) estimation. Most importantly, the MAP reconstruction of the text regions uses a model which accounts for the spatial characteristics of text and graphics. Our experimental comparisons to the baseline JPEG decoding, as well as to three other decoding schemes, demonstrate that our method substantially improves the quality of decoded images, both visually and as measured by PSNR.
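    As a minimal illustration of the segmentation step (a heuristic stand-in, not the document image model used in the paper), each 8x8 block can be labelled directly from its dequantized DCT coefficients; the two-feature rule and thresholds below are assumptions for illustration only.

        import numpy as np

        def classify_block(dct_block, bg_thresh=50.0, text_thresh=2000.0):
            """Label one 8x8 luminance block from its dequantized DCT coefficients."""
            ac = np.array(dct_block, dtype=float)
            ac[0, 0] = 0.0                                # ignore the DC (block mean) term
            energy = float(np.sum(ac * ac))
            if energy < bg_thresh:
                return "background"                       # nearly flat block
            # Text/graphics blocks pack high energy into a few sharp-edge coefficients
            if energy > text_thresh and np.max(np.abs(ac)) ** 2 > 0.25 * energy:
                return "text"
            return "picture"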

    Improving mobile color 2D-barcode JPEG image readability using DCT coefficient distributions

    Two dimensional (2D) barcodes are becoming a pervasive interface for mobile devices, such as camera smartphones. Often, only monochrome 2D-barcodes are used due to their robustness in the uncontrolled operating environment of smartphones. Nonetheless, we are seeing an emerging use of color 2D-barcodes for camera smartphones. Most smartphones capture and store such 2D-barcode images in the baseline JPEG format. As a lossy compression technique, JPEG does introduce a fair amount of error in the captured 2D-barcode images. In this paper, we analyzed the Discrete Cosine Transform (DCT) coefficient distributions of generalized 2D-barcodes using colored data cells with 4, 8, and 10 colors. Using these DCT distributions, we improved the JPEG compression of such mobile barcode images. By altering the JPEG compression parameters based on the DCT coefficient distribution of the barcode images, our improved compression scheme produces JPEG images with higher PSNR values than the baseline implementation. We have also applied our improved scheme to a 10-color 2D-barcode system and analyzed its performance in comparison to the default and alternative JPEG schemes. We have found that our improved scheme provides a marked improvement in the successful decoding of the 10-color 2D-barcode system.
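    Both this and the monochrome study above report their gains in PSNR, defined as 10·log10(peak²/MSE). A minimal reference implementation for 8-bit images (array names are illustrative):

        import numpy as np

        def psnr(original, decoded, peak=255.0):
            """Peak signal-to-noise ratio in dB between two same-sized images."""
            mse = np.mean((original.astype(float) - decoded.astype(float)) ** 2)
            return float("inf") if mse == 0 else 10.0 * np.log10(peak * peak / mse)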

    The JPEG2000 still image compression standard

    The development of standards (emerging and established) by the International Organization for Standardization (ISO), the International Telecommunications Union (ITU), and the International Electrotechnical Commission (IEC) for audio, image, and video, for both transmission and storage, has led to worldwide activity in developing hardware and software systems and products applicable to a number of diverse disciplines [7], [22], [23], [55], [56], [73]. Although the standards implicitly address the basic encoding operations, there is freedom and flexibility in the actual design and development of devices. This is because only the syntax and semantics of the bit stream for decoding are specified by standards, their main objective being the compatibility and interoperability among the systems (hardware/software) manufactured by different companies. There is, thus, much room for innovation and ingenuity. Since the mid 1980s, members from both the ITU and the ISO have been working together to establish a joint international standard for the compression of grayscale and color still images. This effort has been known as JPEG, the Joint Photographic Experts Group.

    Optimum Implementation of Compound Compression of a Computer Screen for Real-Time Transmission in Low Network Bandwidth Environments

    Remote working has become increasingly prevalent in recent times. A large part of remote working involves sharing computer screens between servers and clients. The image content presented when sharing computer screens consists of both natural, camera-captured image data and computer-generated graphics and text. The attributes of natural camera-captured image data differ greatly from those of computer-generated image data. An image containing a mixture of both natural camera-captured and computer-generated image data is known as a compound image. The research presented in this thesis focuses on the challenge of constructing a compound compression strategy that applies the ‘best fit’ compression algorithm to the mixed content found in a compound image. The research also involves analysis and classification of the types of data a given compound image may contain. While researching optimal types of compression, consideration is given to the computational overhead of a given algorithm, because the research is being developed for real-time systems such as cloud computing services, where latency has a detrimental impact on end-user experience. Previous and current state-of-the-art video codecs have been researched, along with many of the most recent publications from academia, to design and implement a novel, low-complexity compound compression algorithm suitable for real-time transmission. The compound compression algorithm will utilise a mixture of lossless and lossy compression algorithms, with parameters that can be used to control the performance of the algorithm. An objective image quality assessment is needed to determine whether the proposed algorithm can produce an acceptable quality image after processing. Both a traditional metric, the Peak Signal to Noise Ratio, and a more recent metric well suited to compound images, the Structural Similarity Index, will be used to define the quality of the decompressed image. Finally, the compression strategy will be tested on a set of generated compound images. Using open-source software, the same images will be compressed with previous and current state-of-the-art video codecs to compare three main metrics: compression ratio, computational complexity, and objective image quality.
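    A hedged sketch of the ‘best fit’ routing idea described in this abstract: tiles of a screen capture that contain few distinct colours and hard edges are sent to a lossless coder, while photographic tiles go to a lossy one. The tile size, colour-count threshold, and two-way split are assumptions for illustration, not the thesis's actual classifier.

        import numpy as np

        def route_tiles(img, tile=16, max_colours=12):
            """Yield (y, x, 'lossless' | 'lossy') for each tile of an RGB screen capture."""
            h = img.shape[0] - img.shape[0] % tile
            w = img.shape[1] - img.shape[1] % tile
            for y in range(0, h, tile):
                for x in range(0, w, tile):
                    patch = img[y:y + tile, x:x + tile].reshape(-1, img.shape[2])
                    n_colours = len(np.unique(patch, axis=0))
                    # Computer-generated content (text, UI chrome) uses few distinct colours
                    yield y, x, "lossless" if n_colours <= max_colours else "lossy"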

    An overview of JPEG 2000

    JPEG-2000 is an emerging standard for still image compression. This paper provides a brief history of the JPEG-2000 standardization process, an overview of the standard, and some description of the capabilities provided by the standard. Part I of the JPEG-2000 standard specifies the minimum compliant decoder, while Part II describes optional, value-added extensions. Although the standard specifies only the decoder and bitstream syntax, in this paper we describe JPEG-2000 from the point of view of encoding. We take this approach, as we believe it is more amenable to a compact description more easily understood by most readers.
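    Because only the decoder and bitstream syntax are normative, any compliant encoder may be used. As one small illustration, Pillow (which wraps the OpenJPEG codec) can write JPEG 2000 codestreams; the option names below follow Pillow's JPEG 2000 plugin and the file names are placeholders, so treat this as a sketch rather than a reference encoder.

        from PIL import Image

        img = Image.open("page.png")
        # Lossy: irreversible wavelet, single quality layer targeting roughly 40 dB
        img.save("page_lossy.jp2", quality_mode="dB", quality_layers=[40], irreversible=True)
        # Lossless: reversible wavelet path, no quality constraint
        img.save("page_lossless.jp2", irreversible=False)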

    A digital signature and watermarking based authentication system for JPEG2000 images

    In this thesis, a digital signature based authentication system is introduced which is able to protect JPEG2000 images in different flavors, including fragile authentication and semi-fragile authentication. Fragile authentication protects the image at the code-stream level, while semi-fragile authentication protects it at the content level. Semi-fragile authentication can be further classified into lossy and lossless authentication. With lossless authentication, the original image can be recovered after verification. Lossless authentication and the new image compression standard, JPEG2000, are the main topics discussed in this thesis.
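    A minimal sketch of the fragile, code-stream-level flavour described above: authenticate the raw JPEG2000 codestream bytes so that any bit change breaks verification. For brevity a keyed HMAC from the Python standard library stands in for the thesis's public-key digital signature, and the file name and key are placeholders.

        import hashlib
        import hmac

        def sign_codestream(path, key):
            """Compute an authentication tag over the entire codestream file."""
            with open(path, "rb") as f:
                return hmac.new(key, f.read(), hashlib.sha256).hexdigest()

        def verify_codestream(path, key, tag):
            """True only if every byte of the codestream is unchanged."""
            return hmac.compare_digest(sign_codestream(path, key), tag)

        tag = sign_codestream("image.jp2", b"shared-secret-key")
        assert verify_codestream("image.jp2", b"shared-secret-key", tag)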

    The JPEG2000 still image coding system: An overview

    With the increasing use of multimedia technologies, image compression requires higher performance as well as new features. To address this need in the specific area of still image encoding, a new standard is currently being developed, the JPEG2000. It is not only intended to provide rate-distortion and subjective image quality performance superior to existing standards, but also to provide features and functionalities that current standards can either not address efficiently or in many cases cannot address at all. Lossless and lossy compression, embedded lossy to lossless coding, progressive transmission by pixel accuracy and by resolution, robustness to the presence of bit-errors and region-of-interest coding, are some representative features. It is interesting to note that JPEG2000 is being designed to address the requirements of a diversity of applications, e.g. Internet, color facsimile, printing, scanning, digital photography, remote sensing, mobile applications, medical imagery, digital library and E-commerce.
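    The ‘embedded lossy to lossless coding’ feature rests on a reversible integer wavelet transform (the LeGall 5/3 filter in Part 1). Below is a minimal 1-D sketch of the 5/3 lifting steps, assuming an even-length signal and a simplified symmetric boundary rule; it demonstrates exact reversibility rather than reproducing the standard's precise boundary handling.

        import numpy as np

        def fwd_53(x):
            """Forward reversible 5/3 lifting: returns integer (lowpass s, highpass d) subbands."""
            x = np.asarray(x, dtype=np.int64)
            n = len(x)                                     # assumed even for this sketch
            d = np.array([x[2*i + 1] - ((x[2*i] + x[min(2*i + 2, n - 2)]) >> 1)
                          for i in range(n // 2)], dtype=np.int64)
            s = np.array([x[2*i] + ((d[max(i - 1, 0)] + d[i] + 2) >> 2)
                          for i in range(n // 2)], dtype=np.int64)
            return s, d

        def inv_53(s, d):
            """Inverse transform: undoes the lifting steps in reverse order, exactly."""
            n = 2 * len(s)
            x = np.zeros(n, dtype=np.int64)
            for i in range(len(s)):
                x[2*i] = s[i] - ((d[max(i - 1, 0)] + d[i] + 2) >> 2)
            for i in range(len(d)):
                x[2*i + 1] = d[i] + ((x[2*i] + x[min(2*i + 2, n - 2)]) >> 1)
            return x

        sig = np.random.randint(0, 256, size=16)
        assert np.array_equal(inv_53(*fwd_53(sig)), sig)   # bit-exact round trip, hence lossless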

    The JPEG 2000 still image compression standard

    With the increasing use of multimedia technologies, image compression requires higher performance as well as new features. To address this need in the specific area of still image encoding, a new standard is currently being developed, the JPEG2000. It is not only intended to provide rate-distortion and subjective image quality performance superior to existing standards, but also to provide features and functionalities that current standards can either not address efficiently or in many cases cannot address at all. Lossless and lossy compression, embedded lossy to lossless coding, progressive transmission by pixel accuracy and by resolution, robustness to the presence of bit-errors and region-of-interest coding, are some representative features. It is interesting to note that JPEG2000 is being designed to address the requirements of a diversity of applications, e.g. Internet, color facsimile, printing, scanning, digital photography, remote sensing, mobile applications, medical imagery, digital library and E-commerce.