
    A Novel Rate Control Algorithm for Onboard Predictive Coding of Multispectral and Hyperspectral Images

    Predictive coding is attractive for compression onboard spacecraft thanks to its low computational complexity, modest memory requirements, and ability to control quality accurately on a pixel-by-pixel basis. Traditionally, predictive compression has focused on the lossless and near-lossless modes of operation, where the maximum error can be bounded but the rate of the compressed image is variable. Rate control is considered a challenging problem for predictive encoders because of the dependencies between quantization and prediction in the feedback loop, and the lack of a signal representation that packs the signal's energy into few coefficients. In this paper, we show that it is possible to design a rate control scheme suitable for onboard implementation. In particular, we propose a general framework for selecting quantizers in each spatial and spectral region of an image so as to achieve the desired target rate while minimizing distortion. The rate control algorithm supports lossy compression, near-lossless compression, and any combination of the two, e.g., lossy compression with a near-lossless constraint. While this framework is independent of the specific predictor used, to demonstrate its performance we tailor it to the predictor adopted by the CCSDS-123 lossless compression standard, obtaining an extension that performs lossless, near-lossless, and lossy compression in a single package. We show that the rate controller achieves excellent accuracy in the output rate, offers good rate-distortion characteristics, and is extremely competitive with state-of-the-art transform coding.
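The quantizer-selection problem the abstract describes, choosing one quantizer per region to hit a target rate at minimal distortion, is commonly solved with a Lagrangian formulation. The sketch below is an illustrative generic version of that idea, not the paper's actual onboard algorithm: it bisects on a Lagrange multiplier so that each block independently minimizes D + λ·R while the total rate meets the budget.

```python
import numpy as np

def select_quantizers(block_rates, block_dists, target_rate, iters=40):
    """Lagrangian quantizer selection (illustrative sketch, not the
    paper's algorithm). block_rates[b][q] and block_dists[b][q] give
    the rate and distortion of block b under candidate quantizer q.
    Bisect on lambda so the total rate meets the target while each
    block independently minimizes D + lambda * R."""
    lo, hi = 0.0, 1e6
    best = None
    for _ in range(iters):
        lam = 0.5 * (lo + hi)
        # each block picks its own best quantizer for this lambda
        choice = [min(range(len(r)), key=lambda q: d[q] + lam * r[q])
                  for r, d in zip(block_rates, block_dists)]
        total = sum(r[q] for r, q in zip(block_rates, choice))
        if total > target_rate:
            lo = lam          # over budget: penalize rate harder
        else:
            hi = lam          # under budget: relax and keep this choice
            best = choice
    return best               # None if the target is unreachable
```

Because each block's choice depends only on λ, the inner loop parallelizes trivially, one reason this style of selection suits constrained onboard hardware.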

    A survey of parallel algorithms for fractal image compression

    This paper presents a short survey of the key research work that has been undertaken in the application of parallel algorithms to fractal image compression. The interest in fractal image compression techniques stems from their ability to achieve high compression ratios whilst maintaining a very high quality in the reconstructed image. The main drawback of this compression method is the very high computational cost associated with the encoding phase. Consequently, there has been significant interest in exploiting parallel computing architectures in order to speed up this phase, whilst still maintaining the advantageous features of the approach. This paper presents a brief introduction to fractal image compression, including the iterated function system theory upon which it is based, and then reviews the different techniques that have been, and can be, applied in order to parallelize the compression algorithm.
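The costly encoding phase the survey refers to is, at its core, an exhaustive search: for each range block, find the domain block and contractive affine map that best approximates it. A minimal sketch of that search, assuming grayscale blocks as NumPy arrays and omitting the eight isometries real coders also test:

```python
import numpy as np

def encode_block(range_blk, domain_blks):
    """For one range block, search all (downsampled) domain blocks for
    the affine map  s * D + o  that best approximates it.  This
    exhaustive search is the expensive step that the surveyed parallel
    algorithms distribute across processors.  Illustrative sketch only:
    real fractal coders also try the 8 block isometries."""
    r = range_blk.ravel().astype(float)
    best = (np.inf, None, 0.0, 0.0)
    for i, dom in enumerate(domain_blks):
        d = dom.ravel().astype(float)
        # least-squares fit of brightness scale s and offset o
        var = d.var()
        s = ((d - d.mean()) @ (r - r.mean())) / (d.size * var) if var > 0 else 0.0
        s = np.clip(s, -1.0, 1.0)   # keep the map contractive
        o = r.mean() - s * d.mean()
        err = np.sum((s * d + o - r) ** 2)
        if err < best[0]:
            best = (err, i, s, o)
    return best  # (error, domain index, scale, offset)
```

Since each range block is matched independently, the outer loop over range blocks is embarrassingly parallel, which is what the surveyed architectures exploit.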

    Lossless Intra Coding in HEVC with 3-tap Filters

    This paper presents a pixel-by-pixel spatial prediction method for lossless intra coding within High Efficiency Video Coding (HEVC). A well-known previous pixel-by-pixel spatial prediction method uses only two neighboring pixels for prediction, based on the angular projection idea borrowed from block-based intra prediction in lossy coding. This paper explores a method which uses three neighboring pixels for prediction according to a two-dimensional correlation model, where the neighboring pixels used and the prediction weights change depending on the intra mode. To find the best prediction weights for each intra mode, a two-stage offline optimization algorithm is used, and a number of implementation aspects are discussed to simplify the proposed prediction method. The proposed method is implemented in the HEVC reference software, and experimental results show that the explored 3-tap filtering method can achieve an average 11.34% bitrate reduction over the default lossless intra coding in HEVC. The proposed method also decreases average decoding time by 12.7% while it increases average encoding time by 9.7%.
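The shape of such a 3-tap pixel predictor can be sketched as follows. The weights below are illustrative placeholders that sum to one (the paper trains per-intra-mode weights offline), and the border padding value of 128 is likewise an assumption:

```python
import numpy as np

def predict_3tap(img, w=(0.75, 0.75, -0.5)):
    """Pixel-by-pixel 3-tap spatial prediction sketch: each pixel is
    predicted as w0*left + w1*top + w2*top-left, and only the residual
    would be entropy-coded.  The weights are illustrative, not the
    paper's trained per-mode weights; unavailable neighbors fall back
    to a mid-gray value of 128 (an assumption)."""
    h, wd = img.shape
    pred = np.zeros((h, wd), dtype=float)
    for y in range(h):
        for x in range(wd):
            left = img[y, x - 1] if x > 0 else 128
            top = img[y - 1, x] if y > 0 else 128
            tl = img[y - 1, x - 1] if x > 0 and y > 0 else 128
            pred[y, x] = w[0] * left + w[1] * top + w[2] * tl
    return img.astype(float) - pred  # residual to be coded losslessly
```

In lossless coding the decoder sees the same reconstructed (i.e. original) neighbors, so it can rebuild each pixel by adding the decoded residual back to the identical prediction.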

    Data compression for full motion video transmission

    Clearly, transmission of visual information will be a major, if not dominant, factor in determining the requirements for, and assessing the performance of, the Space Exploration Initiative (SEI) communications systems. Projected image/video requirements currently anticipated for SEI mission scenarios are presented. Based on this information and projected link performance figures, the image/video data compression requirements which would allow link closure are identified. Finally, several approaches which could satisfy some of the compression requirements are presented, and possible future approaches which show promise for more substantial improvements in compression performance are discussed.

    Optimizing Lossy Compression Rate-Distortion from Automatic Online Selection between SZ and ZFP

    With ever-increasing volumes of scientific data produced by HPC applications, significantly reducing data size is critical because of the limited capacity of storage space and potential bottlenecks on I/O or networks in writing/reading or transferring data. SZ and ZFP are the two leading lossy compressors available to compress scientific data sets. However, their performance is not consistent across different data sets or across different fields of the same data set: for some fields SZ provides better compression performance, while other fields are better compressed with ZFP. This situation raises the need for an automatic online (during compression) selection between SZ and ZFP, with minimal overhead. In this paper, the automatic selection optimizes the rate-distortion, an important statistical quality metric based on the signal-to-noise ratio. To optimize for rate-distortion, we investigate the principles of SZ and ZFP. We then propose an efficient online, low-overhead selection algorithm that predicts the compression quality accurately for both compressors in early processing stages and selects the best-fit compressor for each data field. We implement the selection algorithm in an open-source library, and we evaluate the effectiveness of our proposed solution against plain SZ and ZFP in a parallel environment with 1,024 cores. Evaluation results on three data sets representing about 100 fields show that our selection algorithm improves the compression ratio by up to 70% at the same level of data distortion, thanks to very accurate selection (around 99%) of the best-fit compressor, with little overhead (less than 7% in the experiments).
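The selection mechanism described, predicting each compressor's quality cheaply and routing each field to the winner, can be sketched generically. The estimator callables below are stand-ins for the paper's actual SZ/ZFP prediction models, and the sample-based probing is an assumption about how a low-overhead estimate might be obtained:

```python
import numpy as np

def pick_compressor(field, estimators, sample=4096, seed=0):
    """Online best-fit selection sketch: score each compressor's
    predicted rate-distortion quality on a small random sample of the
    field, then route the whole field to the winner.  `estimators`
    maps a name (e.g. "sz", "zfp") to a cheap callable returning a
    predicted quality score; both callables are hypothetical stand-ins,
    not the real SZ/ZFP estimators from the paper."""
    rng = np.random.default_rng(seed)
    flat = field.ravel()
    idx = rng.choice(flat.size, size=min(sample, flat.size), replace=False)
    probe = flat[idx]                       # cheap subsample of the field
    scores = {name: est(probe) for name, est in estimators.items()}
    return max(scores, key=scores.get)      # best predicted quality wins
```

Keeping the probe small is what keeps the claimed overhead low: the estimators see a few thousand values rather than the full field.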

    Audiovisual preservation strategies, data models and value-chains

    This is a report on preservation strategies, models and value-chains for digital file-based audiovisual content. The report includes: (a) current and emerging value-chains and business models for audiovisual preservation; (b) a comparison of preservation strategies for audiovisual content, including their strengths and weaknesses; and (c) a review of current preservation metadata models and requirements for extension to support audiovisual files.

    Advanced solutions for quality-oriented multimedia broadcasting

    Multimedia content is increasingly being delivered via different types of networks to viewers in a variety of locations and contexts using a variety of devices. The ubiquitous nature of multimedia services comes at a cost, however. The successful delivery of multimedia services will require overcoming numerous technological challenges, many of which have a direct effect on the quality of the multimedia experience. For example, due to dynamically changing requirements and networking conditions, the delivery of multimedia content has traditionally adopted a best effort approach. However, this approach has often led to the end-user perceived quality of multimedia-based services being negatively affected. Yet the quality of multimedia content is a vital issue for the continued acceptance and proliferation of these services. Indeed, end-users are becoming increasingly quality-aware in their expectations of multimedia experience and demand an ever-widening spectrum of rich multimedia-based services. As a consequence, there is a continuous and extensive research effort, by both industry and academia, to find solutions for improving the quality of multimedia content delivered to users; likewise, international standards bodies, such as the International Telecommunication Union (ITU), are renewing their efforts on the standardization of multimedia technologies. There are very different directions in which research has attempted to find solutions in order to improve the quality of the rich media content delivered over various network types. It is in this context that this special issue on broadcast multimedia quality of the IEEE Transactions on Broadcasting illustrates some of these avenues and presents some of the most significant research results obtained by various teams of researchers from many countries. This special issue provides an example, albeit inevitably limited, of the richness and breadth of the current research on multimedia broadcasting services. The research issues addressed in this special issue include, among others, factors that influence user perceived quality, encoding-related quality assessment and control, transmission and coverage-based solutions, and objective quality measurements.