
    Hyperspectral image compression : adapting SPIHT and EZW to Anisotropic 3-D Wavelet Coding

    Hyperspectral images present specific characteristics that an efficient compression system should exploit. In compression, wavelets have shown good adaptability to a wide range of data while remaining of reasonable complexity, and wavelet-based compression algorithms have been used successfully on several hyperspectral space missions. This paper focuses on the optimization of a full wavelet compression system for hyperspectral images. Each step of the compression algorithm is studied and optimized. First, an algorithm is defined to find the optimal 3-D wavelet decomposition in a rate-distortion sense. It is then shown that a specific fixed decomposition achieves almost the same performance while being preferable in terms of complexity, and that this decomposition significantly improves on the classical isotropic decomposition. One of its most useful properties is that it allows the use of zerotree algorithms. Various tree structures, each creating a relationship between coefficients, are compared. Two efficient zerotree coding methods (EZW and SPIHT) are adapted to this near-optimal decomposition with the best tree structure found. Performance is compared with an adaptation of JPEG 2000 for hyperspectral images on six areas with different statistical properties.
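The anisotropic idea in the abstract above can be illustrated with a toy decomposition: the spectral axis of the data cube is decomposed more deeply than the two spatial axes. This is only a minimal sketch with an orthonormal Haar filter; the function names, the Haar choice, and the level counts are illustrative, not the paper's actual decomposition.

```python
import numpy as np

def haar_step(x, axis):
    # One level of the orthonormal Haar transform along `axis`:
    # approximation and detail halves, concatenated along that axis.
    x = np.moveaxis(x, axis, 0)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return np.moveaxis(np.concatenate([a, d], axis=0), 0, axis)

def anisotropic_haar(cube, spectral_levels=2, spatial_levels=1):
    # Hypothetical anisotropic 3-D decomposition: axis 0 (spectral) is
    # decomposed more deeply than the spatial axes, reflecting the fact
    # that hyperspectral data are typically smoother spectrally.
    out = cube.astype(float)
    n = out.shape[0]
    for _ in range(spectral_levels):
        out[:n] = haar_step(out[:n], axis=0)  # recurse on the approximation
        n //= 2
    for _ in range(spatial_levels):
        out = haar_step(out, axis=1)
        out = haar_step(out, axis=2)
    return out
```

Because each Haar step is orthonormal, the transform preserves the total energy of the cube, which is what makes a rate-distortion comparison between decomposition structures meaningful.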

    Image compression based on 2D Discrete Fourier Transform and matrix minimization algorithm

    In the present era of the internet and multimedia, image compression techniques are essential to improve image and video performance in terms of storage space, network bandwidth usage, and secure transmission. A number of image compression methods are available, with largely differing compression ratios and coding complexity. In this paper, we propose a new method for compressing high-resolution images based on the Discrete Fourier Transform (DFT) and the Matrix Minimization (MM) algorithm. The method consists of transforming an image by DFT, yielding the real and imaginary components. A quantization process is applied to both components independently, aimed at increasing the number of high-frequency coefficients. The real component matrix is separated into Low Frequency Coefficients (LFC) and High Frequency Coefficients (HFC). Finally, the MM algorithm followed by arithmetic coding is applied to the LFC and HFC matrices. The decompression algorithm decodes the data in reverse order; a sequential search algorithm is used to decode the data from the MM matrix. Thereafter, all decoded LFC and HFC values are combined into one matrix, followed by the inverse DFT. Results demonstrate that the proposed method yields high compression ratios, over 98% for structured light images, with good image reconstruction. Moreover, it is shown that the proposed method compares favorably with the JPEG technique in terms of compression ratio and image quality.
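The front end described above (2-D DFT, independent quantization of the real and imaginary parts, and a split of the real part into low- and high-frequency coefficients) can be sketched as follows. The cutoff, the quantization step, and the function name are assumptions for illustration; the MM and arithmetic coding stages are omitted.

```python
import numpy as np

def dft_split(image, cutoff=4, q=8.0):
    # Hypothetical sketch of the transform/quantize/split stage:
    # 2-D DFT, uniform quantization of each component independently,
    # then separation of the real part into LFC and HFC.
    F = np.fft.fft2(image.astype(float))
    real_q = np.round(F.real / q)  # quantization drives many high-frequency
    imag_q = np.round(F.imag / q)  # coefficients toward zero
    # In the unshifted spectrum, low frequencies occupy the corner block.
    lfc = real_q[:cutoff, :cutoff]
    mask = np.ones(real_q.shape, dtype=bool)
    mask[:cutoff, :cutoff] = False
    hfc = real_q[mask]             # flattened high-frequency coefficients
    return lfc, hfc, imag_q
```

A subsequent entropy coder benefits from this split because the HFC array is dominated by zeros after quantization, while the small LFC block carries most of the image energy.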

    Quality Adaptive Least Squares Trained Filters for Video Compression Artifacts Removal Using a No-reference Block Visibility Metric

    Compression artifact removal is a challenging problem because videos can be compressed at different qualities. In this paper, a least squares approach that is self-adaptive to the visual quality of the input sequence is proposed. For compression artifacts, the visual quality of an image is measured by a no-reference block visibility metric. According to the blockiness visibility of an input image, an appropriate set of filter coefficients, trained beforehand, is selected for optimally removing coding artifacts and reconstructing object details. The performance of the proposed algorithm is evaluated on a variety of sequences compressed at different qualities, in comparison with several other deblocking techniques. The proposed method outperforms the others significantly, both objectively and subjectively.
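The core training step in a least-squares-trained filter of this kind can be sketched in a few lines: learn a k×k filter that maps degraded patches to the corresponding clean pixel, one coefficient set per quality class. This is a minimal sketch under assumed names; the paper's blockiness metric and per-quality coefficient selection are not reproduced here.

```python
import numpy as np

def train_ls_filter(degraded, clean, k=3):
    # Build one row per pixel: the k*k degraded neighbourhood, flattened.
    pad = k // 2
    padded = np.pad(degraded.astype(float), pad, mode='edge')
    rows = [padded[i:i + k, j:j + k].ravel()
            for i in range(degraded.shape[0])
            for j in range(degraded.shape[1])]
    A = np.array(rows)
    b = clean.astype(float).ravel()
    # Least-squares filter taps minimizing ||A w - b||^2.
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs

def apply_ls_filter(degraded, coeffs, k=3):
    pad = k // 2
    padded = np.pad(degraded.astype(float), pad, mode='edge')
    out = np.empty(degraded.shape, dtype=float)
    for i in range(degraded.shape[0]):
        for j in range(degraded.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].ravel() @ coeffs
    return out
```

In the method above, several such coefficient sets would be trained offline at different compression qualities, and the no-reference blockiness metric would pick which set to apply to a given input.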

    Enhancement Of Medical Image Compression Algorithm In Noisy WLANS Transmission

    Advances in telemedicine technology enable rapid medical diagnoses with visualization and quantitative assessment by medical practitioners. In healthcare and hospital networks, medical data exchange over wireless local area network (WLAN) transceivers remains challenging because of growing data sizes, real-time interaction with compressed images, and the range of bandwidths requiring transmission support. Prior to transmission, medical data are compressed to minimize transmission bandwidth and save transmitting power. Researchers face many challenges in improving the performance of compression approaches, including poor energy compaction, high entropy values, low compression ratios (CR), and high computational complexity in real-time implementation. Thus, a new approach called Enhanced Independent Component Analysis (EICA) has been developed for medical image compression; it transforms the image data by block-based Independent Component Analysis (ICA). The proposed method uses the Fast Independent Component Analysis (FastICA) algorithm, followed by a quantization architecture based on a zero quantized coefficients percentage (ZQCP) prediction model using an artificial neural network.
    For image reconstruction, decoding steps based on the developed quantization architecture are examined. EICA is particularly useful where the size of the transmitted data needs to be reduced to minimize image transmission time. A comparative analysis is performed against existing data compression techniques: the discrete cosine transform (DCT), set partitioning in hierarchical trees (SPIHT), and Joint Photographic Experts Group (JPEG 2000). Three main modules, namely the compression segment (CS), transceiver segment (TRS), and outcome segment (OTS), are developed to realize a fully computerized simulation tool for medical data compression with suitable and effective performance. The CS module compresses medical data using four different approaches: DCT, SPIHT, JPEG 2000, and EICA. The TRS module is processed by low-cost WLANs with low-bandwidth transmission. Finally, the OTS module is used for data decompression and result visualization. In terms of the compression module, results show the benefits of applying EICA to medical data compression and transmission, and the developed system displays favorable outcomes in compressing and transmitting medical data. In conclusion, all three modules (CS, TRS, and OTS) are integrated to yield a computerized prototype named the Medical Data Simulation System (Medata-SIM), which includes medical data compression and a transceiver with visualization to aid medical practitioners in carrying out rapid diagnoses.
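The ZQCP quantity that the abstract's prediction model estimates, the percentage of transform coefficients that quantize to zero at a given step size, is simple to compute directly once the coefficients are known. The helper below is a minimal sketch (the name and uniform quantizer are assumptions); the abstract's neural-network predictor, which estimates this value without fully quantizing, is not reproduced.

```python
import numpy as np

def zqcp(coeff_block, step):
    # Zero quantized coefficients percentage for one transformed block
    # at quantization step `step` (uniform mid-tread quantizer assumed).
    q = np.round(np.asarray(coeff_block, dtype=float) / step)
    return 100.0 * np.count_nonzero(q == 0) / q.size
```

A larger step drives more coefficients to zero, so ZQCP grows monotonically with the step size; predicting it lets the encoder pick a step that hits a target rate without trial quantization.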