
    Bulking II: Classifications of Cellular Automata

    This paper is the second part of a two-paper series dealing with bulking: a way to define a quasi-order on cellular automata by comparing space-time diagrams up to rescaling. In the present paper, we introduce three notions of simulation between cellular automata and study the quasi-order structures these simulation relations induce on the whole set of cellular automata. Various aspects of these quasi-orders are considered (induced equivalence relations, maximum elements, induced orders, etc.), providing several formal tools for classifying cellular automata.
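
    A minimal sketch (not taken from the paper) of what comparing space-time diagrams "up to rescaling" can look like for elementary cellular automata: the rule number, the helper names step and rescale, and the <m, t> grouping parameters are illustrative assumptions, and the shift component of a full rescaling transform is omitted.

        import numpy as np

        def step(config, rule=110):
            """One synchronous step of an elementary CA with periodic boundary."""
            left, right = np.roll(config, 1), np.roll(config, -1)
            table = np.array([(rule >> i) & 1 for i in range(8)])
            return table[4 * left + 2 * config + right]

        def rescale(diagram, m, t):
            """Coarse-grain a space-time diagram: keep every t-th time step and
            pack m adjacent cells into one super-cell (encoded as an integer).
            This is a simplified <m, t> rescaling with the shift left out."""
            T, N = diagram.shape
            sub = diagram[::t, : (N // m) * m]
            return sub.reshape(sub.shape[0], -1, m) @ (2 ** np.arange(m)[::-1])

        # usage: a rule-110 diagram viewed at scale <m=2, t=2>
        rng = np.random.default_rng(0)
        rows = [rng.integers(0, 2, 64)]
        for _ in range(15):
            rows.append(step(rows[-1]))
        coarse = rescale(np.array(rows), m=2, t=2)

    Comparing two automata then amounts to asking whether some rescaling of one reproduces (or embeds into) some rescaling of the other, which is the kind of relation the simulation quasi-orders formalize.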

    Dual-DCT-Lifting-Based Lapped Transform with Improved Reversible Symmetric Extension

    We present a lifting-based lapped transform (L-LT) and a reversible symmetric extension (RSE) for boundary processing, aimed at more effective lossy-to-lossless image coding, in which data of various qualities are obtained from a single piece of lossless compressed data. The proposed dual-DCT-lifting-based LT (D2L-LT) processes two identical LTs in parallel and consists of 1-D and 2-D DCT-liftings, which allow the direct use of a DCT matrix in each lifting coefficient. Since the DCT-lifting can utilize any existing DCT software or hardware, it has great potential for implementations tailored to the architecture and DCT algorithm used. In addition, we present an improved RSE (IRSE) that recalculates the boundary processing and thereby solves the boundary problem of the DCT-lifting-based L-LT (DL-LT). We show that D2L-LT with IRSE mostly outperforms conventional L-LTs in lossy-to-lossless image coding.
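
    As a rough illustration of the idea that a DCT can sit directly inside lifting steps while keeping the transform integer-reversible, here is a minimal sketch; the specific predict/update operators and the 0.5 scaling are hypothetical choices for illustration, not the paper's D2L-LT design.

        import numpy as np
        from scipy.fft import dct, idct

        def dct_lifting_forward(block):
            """One reversible lifting step whose predict/update operators use a DCT
            (assumes an even-length integer block)."""
            even, odd = block[0::2].astype(float), block[1::2].astype(float)
            # predict: estimate the odd channel from a DCT of the even channel
            odd = odd - np.round(dct(even, norm='ortho'))
            # update: feed a scaled inverse DCT of the detail back into the even channel
            even = even + np.round(0.5 * idct(odd, norm='ortho'))
            return even, odd

        def dct_lifting_inverse(even, odd):
            """Exact inverse: undo the lifting steps in reverse order."""
            even = even - np.round(0.5 * idct(odd, norm='ortho'))
            odd = odd + np.round(dct(even, norm='ortho'))
            out = np.empty(even.size + odd.size)
            out[0::2], out[1::2] = even, odd
            return out

        # usage: the rounding inside each lifting step keeps the mapping lossless
        x = np.arange(16) % 7
        lo, hi = dct_lifting_forward(x)
        assert np.array_equal(dct_lifting_inverse(lo, hi), x)

    Because each step only adds a rounded function of the other channel, any existing DCT implementation (software or hardware) can be dropped into the lifting structure without breaking reversibility, which is the property the abstract highlights.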

    On the design of fast and efficient wavelet image coders with reduced memory usage

    Image compression is of great importance in multimedia systems and applications because it drastically reduces bandwidth requirements for transmission and memory requirements for storage. Although earlier standards for image compression were based on the Discrete Cosine Transform (DCT), a more recently developed technique, the Discrete Wavelet Transform (DWT), has been found to be more efficient for image coding. Despite improvements in compression efficiency, wavelet image coders significantly increase memory usage and complexity compared with DCT-based coders. A major reason for the high memory requirements is that the usual algorithm to compute the wavelet transform requires the entire image to be in memory. Although some proposals reduce memory usage, they present problems that hinder their implementation. In addition, some wavelet image coders, like SPIHT (which has become a benchmark for wavelet coding), always need to hold the entire image in memory. Regarding the complexity of the coders, SPIHT can be considered quite complex because it performs bit-plane coding with multiple image scans. The wavelet-based JPEG 2000 standard is more complex still because it improves coding efficiency through time-consuming methods, such as an iterative optimization algorithm based on the Lagrange multiplier method and high-order context modeling. In this thesis, we aim to reduce memory usage and complexity in wavelet-based image coding while preserving compression efficiency. To this end, a run-length encoder and a tree-based wavelet encoder are proposed. In addition, a new algorithm to efficiently compute the wavelet transform is presented. This algorithm achieves low memory consumption through line-by-line processing, and it employs recursion to automatically determine the order in which the wavelet transform is computed, solving some synchronization problems that have not been tackled by previous proposals. The proposed encode…
    Oliver Gil, JS. (2006). On the design of fast and efficient wavelet image coders with reduced memory usage [Unpublished doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/1826
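
    For readers unfamiliar with why line-by-line processing keeps memory low, the following is a minimal sketch (not the algorithm proposed in the thesis) of the reversible CDF 5/3 lifting used in JPEG 2000 applied to a single line; the function names and even-length assumption are illustrative. A line-based 2-D transform applies the same lifting vertically over a rolling buffer of only a few rows, instead of holding the whole image.

        import numpy as np

        def lift53_forward(x):
            """Reversible CDF 5/3 lifting of one line (integer arithmetic,
            even-length input, symmetric extension at the borders)."""
            x = np.asarray(x, dtype=np.int64)
            even, odd = x[0::2].copy(), x[1::2].copy()
            right = np.append(even[1:], even[-1])
            d = odd - (even + right) // 2                  # predict -> high-pass
            left_d = np.concatenate(([d[0]], d[:-1]))
            s = even + (left_d + d + 2) // 4               # update  -> low-pass
            return s, d

        def lift53_inverse(s, d):
            """Exact integer inverse of lift53_forward."""
            left_d = np.concatenate(([d[0]], d[:-1]))
            even = s - (left_d + d + 2) // 4
            right = np.append(even[1:], even[-1])
            odd = d + (even + right) // 2
            x = np.empty(even.size + odd.size, dtype=np.int64)
            x[0::2], x[1::2] = even, odd
            return x

        # usage: exact round trip on one toy image line
        line = np.array([10, 12, 11, 9, 8, 8, 7, 200], dtype=np.int64)
        s, d = lift53_forward(line)
        assert np.array_equal(lift53_inverse(s, d), line)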

    The JPEG2000 still image compression standard

    The development of standards (emerging and established) by the International Organization for Standardization (ISO), the International Telecommunication Union (ITU), and the International Electrotechnical Commission (IEC) for audio, image, and video, for both transmission and storage, has led to worldwide activity in developing hardware and software systems and products applicable to a number of diverse disciplines [7], [22], [23], [55], [56], [73]. Although the standards implicitly address the basic encoding operations, there is freedom and flexibility in the actual design and development of devices. This is because only the syntax and semantics of the bit stream for decoding are specified by the standards, their main objective being compatibility and interoperability among the systems (hardware/software) manufactured by different companies. There is, thus, much room for innovation and ingenuity. Since the mid-1980s, members of both the ITU and the ISO have been working together to establish a joint international standard for the compression of grayscale and color still images. This effort has been known as JPEG, the Joint Photographic Experts Group.