
    Fast Search Approaches for Fractal Image Coding: Review of Contemporary Literature

    Fractal Image Compression (FIC) was first conceptualized as a model in 1989, and numerous models have since been developed from it. Fractals were initially observed and described through the Iterated Function System (IFS), and IFS solutions were used for encoding images. Because the IFS description of an image requires far less storage than the image itself, representing images in IFS form became the foundation of fractal image compression systems. The time consumed by encoding must be addressed to achieve optimal compression, and the solutions reviewed in this study show that, despite the developments that have taken place, there is still considerable scope for improvement. From this exhaustive review of models it is evident that numerous advancements have been made to the FIC model over time and that it has been adapted to image compression at varied levels. This study focuses on the existing literature on FIC and presents the insights of the various models.
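    As a concrete illustration of why encoding time dominates FIC, the sketch below shows a brute-force partitioned-IFS (PIFS) encoder: every range block is compared against every candidate domain block, which is exactly the search that fast-search approaches try to avoid. The block sizes, search step, and least-squares fit are illustrative assumptions, not a specific algorithm from the surveyed literature.

```python
# Minimal PIFS encoder sketch (assumed parameters, not from the reviewed papers).
import numpy as np

def downsample2(block):
    """Average 2x2 pixel groups so a domain block matches the range-block size."""
    return block.reshape(block.shape[0] // 2, 2, block.shape[1] // 2, 2).mean(axis=(1, 3))

def encode_pifs(img, range_size=8, domain_step=8):
    """Brute-force search: for each range block, find the domain block and the
    affine map (contrast s, brightness o) that minimise the squared error."""
    h, w = img.shape
    transforms = []
    for ry in range(0, h, range_size):
        for rx in range(0, w, range_size):
            r = img[ry:ry + range_size, rx:rx + range_size].astype(float).ravel()
            best = None
            for dy in range(0, h - 2 * range_size + 1, domain_step):
                for dx in range(0, w - 2 * range_size + 1, domain_step):
                    d = downsample2(img[dy:dy + 2 * range_size,
                                        dx:dx + 2 * range_size].astype(float)).ravel()
                    var = d.var()
                    s = 0.0 if var == 0 else np.cov(d, r, bias=True)[0, 1] / var
                    o = r.mean() - s * d.mean()
                    err = ((s * d + o - r) ** 2).sum()
                    if best is None or err < best[0]:
                        best = (err, dy, dx, s, o)
            transforms.append((ry, rx) + best[1:])
    return transforms
```

    The nested domain loop is what makes naive encoding slow; the fast-search approaches reviewed above replace it with block classification, feature-space nearest-neighbour search, or searchless schemes.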

    CompaCT: Fractal-Based Heuristic Pixel Segmentation for Lossless Compression of High-Color DICOM Medical Images

    Medical image compression is a widely studied field of data processing due to its prevalence in modern digital databases. This domain requires a high color depth of 12 bits per pixel component for accurate analysis by physicians, primarily in the DICOM format. Standard raster-based compression of images via filtering is well-known; however, it remains suboptimal in the medical domain due to non-specialized implementations. This study proposes a lossless medical image compression algorithm, CompaCT, that aims to target spatial features and patterns of pixel concentration for dynamically enhanced data processing. The algorithm employs fractal pixel traversal coupled with a novel approach of segmentation and meshing between pixel blocks for preprocessing. Furthermore, delta and entropy coding are applied to this concept for a complete compression pipeline. The proposal demonstrates that the data compression achieved via fractal segmentation preprocessing yields enhanced image compression results while remaining lossless in its reconstruction accuracy. CompaCT is evaluated in its compression ratios on 3954 high-color CT scans against the efficiency of industry-standard compression techniques (i.e., JPEG2000, RLE, ZIP, PNG). Its reconstruction performance is assessed with error metrics to verify lossless image recovery after decompression. The results demonstrate that CompaCT can compress and losslessly reconstruct medical images, being 37% more space-efficient than industry-standard compression systems.
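    The abstract does not spell out the traversal curve or the segmentation details, so the sketch below only illustrates the general pattern it describes: walk the pixels along a space-filling ("fractal") curve so that spatially close pixels stay close in the serialized stream, then delta-code the stream before entropy coding. A Z-order (Morton) curve and plain differencing are assumptions standing in for CompaCT's actual traversal, segmentation, and meshing steps.

```python
# Illustrative fractal-traversal + delta-coding sketch (not CompaCT itself).
import numpy as np

def morton_order(n):
    """(y, x) coordinates of an n-by-n image in Z-order; n must be a power of two."""
    coords = []
    for i in range(n * n):
        x = y = 0
        for b in range(n.bit_length()):
            x |= ((i >> (2 * b)) & 1) << b       # even bits of i -> x
            y |= ((i >> (2 * b + 1)) & 1) << b   # odd bits of i  -> y
        coords.append((y, x))
    return coords

def traverse_and_delta(img):
    """Serialise a square image along the Z-order curve and delta-code it."""
    stream = np.array([int(img[y, x]) for y, x in morton_order(img.shape[0])],
                      dtype=np.int32)
    deltas = np.diff(stream, prepend=0)          # cumulative sum restores the stream
    return deltas                                # would then go to an entropy coder
```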

    Deep Pipeline Architecture for Fast Fractal Color Image Compression Utilizing Inter-Color Correlation

    Fractal compression is a well-known technique that encodes an image by mapping the image onto itself, which requires a massive, repetitive search. The encoding time is therefore very long, which is the main problem of the fractal algorithm. To reduce the encoding time, several hardware implementations have been developed. However, they are generally designed for grayscale images, and using them to encode colour images multiplies the encoding time by at least three. Therefore, in this paper, a new high-speed hardware architecture is proposed for encoding RGB images in a short time. Unlike the conventional approach of encoding each colour component individually as a grayscale image, the proposed method encodes two of the colour components by mapping them directly to the most correlated component with a searchless encoding scheme, while the third component is encoded with a search-based scheme. This reduces the encoding time and also increases the compression rate. Parallel and deep-pipelining approaches are utilized to improve the processing time significantly. Furthermore, to halve memory accesses, the image is partitioned so that half of the matching operations reuse the data already fetched for the other half. Consequently, the proposed architecture can encode a 1024×1024 RGB image in as little as 12.2 ms with a compression ratio of 46.5, and is thus superior to state-of-the-art architectures. ©2022 The Authors. Published by IEEE. This work is licensed under a Creative Commons Attribution 4.0 License (https://creativecommons.org/licenses/by/4.0/).
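    A minimal sketch of the searchless inter-colour idea described above, under assumed parameters: a block of one colour component is approximated directly from the co-located block of the most correlated component via an affine map, so no domain search is needed for that component. The 8x8 block size, the plain least-squares fit, and treating one component as the reference are illustrative assumptions rather than the paper's hardware design.

```python
# Searchless inter-colour mapping sketch (assumed block size and fit).
import numpy as np

def searchless_color_map(ref, target, block=8):
    """For each block of `target`, fit target ~ s*ref + o against the co-located
    block of the reference component and keep only (s, o) per block."""
    h, w = ref.shape
    params = []
    for y in range(0, h, block):
        for x in range(0, w, block):
            r = ref[y:y + block, x:x + block].astype(float).ravel()
            t = target[y:y + block, x:x + block].astype(float).ravel()
            var = r.var()
            s = 0.0 if var == 0 else np.cov(r, t, bias=True)[0, 1] / var
            o = t.mean() - s * r.mean()
            params.append((y, x, s, o))
    return params
```

    Because each block depends only on the already-decoded reference component at the same position, the per-block work is fixed and independent, which is what makes such a scheme friendly to a deep hardware pipeline.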

    Fractal Analysis

    Fractal analysis is becoming more and more common in all walks of life, including biomedical engineering, steganography and art. Writing one book on all these topics is a very difficult task, so this book covers only selected topics. Interested readers will find in this book the topics of image compression, groundwater quality, establishing downscaling and spatio-temporal scale conversion models of NDVI, modelling and optimization of 3T fractional nonlinear generalized magneto-thermoelastic multi-material, algebraic fractals in steganography, strain-induced microstructures in metals, and much more. The book will be of interest to scientists dealing with fractal analysis, as well as to biomedical and IT engineers. I encourage you to view the individual chapters.

    Bitplanes Block Based Exact Image Compression

    In this paper, an exact (lossless) image compression method based on bit-plane blocking is proposed. The algorithm uses two-bit codes for block representation; the codes indicate whether a block is a uni-code (uniform) block or a non-uniform block. Non-uniform blocks are divided further, and division continues down to the smallest block size, two by two in this study, which is kept as residual data. The main encoding process consumes three codes, and a subsequent pass uses a fourth code for further compression. The resulting file is then subjected to further exact compression, for which Huffman coding is used in this study. The implementation complexity of compression and decompression is comparable with well-known methods, as is the compression ratio. Parallelization of the algorithm is straightforward and depends on the number of planes; within a plane, the process is simple to realize in hardware and does not require special hardware.
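    The following sketch illustrates the block-division idea the abstract describes, with assumed code symbols: on each bit plane, a block that is entirely zeros or entirely ones is emitted as a single short code, otherwise it is split into four sub-blocks, down to 2x2 blocks whose raw bits are kept as residuals. The symbol names and the assumption of a square, power-of-two-sized image are illustrative; the resulting symbol streams would then be passed to a Huffman coder as in the paper.

```python
# Bit-plane block coding sketch (assumed code symbols 'U0', 'U1', 'SPLIT', 'RAW').
import numpy as np

def encode_block(plane, y, x, size, codes, residuals):
    block = plane[y:y + size, x:x + size]
    if block.min() == block.max():                 # uniform block of 0s or 1s
        codes.append('U0' if block[0, 0] == 0 else 'U1')
    elif size == 2:                                # smallest block: keep raw bits
        codes.append('RAW')
        residuals.extend(block.ravel().tolist())
    else:                                          # non-uniform: split into quadrants
        codes.append('SPLIT')
        half = size // 2
        for dy in (0, half):
            for dx in (0, half):
                encode_block(plane, y + dy, x + dx, half, codes, residuals)

def encode_bitplanes(img, bits=8):
    """Run the block coder independently on each bit plane (easily parallelised)."""
    streams = []
    for b in range(bits):
        plane = (img >> b) & 1
        codes, residuals = [], []
        encode_block(plane, 0, 0, img.shape[0], codes, residuals)
        streams.append((codes, residuals))
    return streams
```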

    Significant medical image compression techniques: a review

    Telemedicine applications allow the patient and doctor to communicate with each other through network services. Several medical image compression techniques have been suggested by researchers over the past years. This review offers a comparison of the algorithms and their performance by analysing three factors that influence the choice of compression algorithm: image quality, compression ratio, and compression speed. Previous research has shown that medical imaging needs effective algorithms without data loss, which is why lossless compression is used for medical records. Lossless compression, however, achieves only modest compression ratios. One way to obtain an optimal compression ratio is to segment the image into region-of-interest (ROI) and non-ROI zones, so that the power and time required can be minimised thanks to the smaller area that must be coded losslessly. Recently, several researchers have attempted to create hybrid compression algorithms by integrating different compression techniques to increase compression efficiency.
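    As a toy illustration of the ROI idea mentioned in the review, the sketch below keeps ROI pixels losslessly and quantises the background before a common lossless stage. The mask, the flat quantisation step, and the use of zlib are assumptions for illustration; practical systems would use ROI-capable coders such as JPEG2000.

```python
# ROI / non-ROI hybrid sketch (assumed mask, quantisation step q, zlib back end).
import numpy as np
import zlib

def roi_compress(img, roi_mask, q=16):
    """Lossless inside the ROI, coarsely quantised (lossy) outside, then zlib."""
    roi_pixels = img[roi_mask]                    # exact values inside the ROI
    background = (img // q).astype(img.dtype)     # quantised non-ROI region
    background[roi_mask] = 0                      # ROI carried separately
    payload = roi_mask.tobytes() + roi_pixels.tobytes() + background.tobytes()
    return zlib.compress(payload)
```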

    Survey of Hybrid Image Compression Techniques

    Compression reduces the size of data while maintaining the quality of the information contained therein. This paper presents a survey of research papers on improvements to various hybrid compression techniques over the last decade. A hybrid compression technique combines the best properties of each group of methods, as is done in the JPEG compression method: it combines lossy and lossless compression to obtain a high compression ratio while maintaining the quality of the reconstructed image. Lossy compression produces a relatively high compression ratio, whereas lossless compression provides high-quality data reconstruction, since the data can later be decompressed to match the original exactly. Discussion of the state of, and open issues in, ongoing hybrid compression development indicates opportunities for further research to improve the performance of image compression methods.
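    The sketch below shows the hybrid pattern in miniature, in the spirit of JPEG: a lossy stage (blockwise DCT and quantisation) followed by a lossless stage (entropy coding with zlib). The 8x8 block size, the flat quantisation step, and zlib in place of JPEG's Huffman or arithmetic coder are simplifying assumptions, not a full JPEG implementation.

```python
# Hybrid lossy + lossless pipeline sketch (assumes image dimensions divisible by 8).
import numpy as np
import zlib
from scipy.fftpack import dct

def hybrid_compress(img, q=20):
    h, w = img.shape
    coeffs = np.empty((h, w), dtype=np.int16)
    for y in range(0, h, 8):
        for x in range(0, w, 8):
            block = img[y:y + 8, x:x + 8].astype(float) - 128
            c = dct(dct(block, axis=0, norm='ortho'), axis=1, norm='ortho')
            coeffs[y:y + 8, x:x + 8] = np.round(c / q)    # lossy: quantised DCT
    return zlib.compress(coeffs.tobytes())                 # lossless: entropy coding
```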