
    Lossless compression of RGB color images

    Although much work has been done toward developing lossless algorithms for compressing image data, most techniques reported have been for two-tone or gray-scale images. It is generally accepted that a color image can be easily encoded by applying a gray-scale compression technique to each of the three color planes. Such an approach, however, fails to take into account the substantial correlations that are present between color planes. Although several lossy compression schemes that exploit such correlations have been reported in the literature, we are not aware of any such techniques for lossless compression. Because of the difference in goals, the best way of exploiting redundancies for lossy and lossless compression can be, and usually is, very different. We propose and investigate a few lossless compression schemes for RGB color images. Both prediction schemes and error-modeling schemes are presented that exploit inter-plane correlations. Implementation results on a test set of images yield significant improvements.
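    The inter-plane prediction idea can be illustrated with a minimal sketch (an assumed difference predictor, not the paper's actual scheme): code the green plane directly and replace red and blue by their differences from green, which typically leaves lower-entropy residuals for a lossless entropy coder.

```python
import numpy as np

def interchannel_residuals(rgb):
    """Decorrelate color planes by predicting R and B from G.

    A minimal sketch of inter-plane prediction (hypothetical scheme,
    not the paper's exact predictor): G is kept as-is, while R and B
    are replaced by their differences from G.  Arithmetic is done in
    a wider signed type so the transform is exactly invertible.
    """
    rgb = rgb.astype(np.int16)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return np.stack([r - g, g, b - g], axis=-1)

def invert(residuals):
    """Recover the original RGB planes losslessly."""
    dr, g, db = residuals[..., 0], residuals[..., 1], residuals[..., 2]
    return np.stack([dr + g, g, db + g], axis=-1).astype(np.uint8)
```

    Because the transform is invertible, any gain comes purely from the residual planes being easier to entropy-code than the raw R and B planes.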

    Automatic compression for image sets using a graph theoretical framework

    A new automatic compression scheme that adapts to any image set is presented in this thesis. The proposed scheme requires no a priori knowledge of the properties of the image set. It is obtained using a unified graph-theoretical framework that allows compression strategies to be compared both theoretically and experimentally. The strategy achieves optimal lossless compression by computing a minimum spanning tree of a graph constructed from the image set. For lossy compression, the scheme is near-optimal and a performance guarantee relative to the optimal one is provided. Experimental results demonstrate that this compression strategy compares favorably to previously proposed strategies, with improvements of up to 7% in the case of lossless compression and 72% in the case of lossy compression. This thesis also shows that the choice of underlying compression algorithm is important when compressing image sets with the proposed scheme.
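    The minimum-spanning-tree idea can be sketched as follows (a toy cost model, not the thesis's codec: edge weight is the zlib-compressed size of the byte-wise difference between two images, and a virtual root models coding an image independently; Prim's algorithm then yields a coding plan):

```python
import zlib

def mst_coding_plan(images):
    """Choose, for each image, whether to code it independently or as a
    difference from another image, minimizing total compressed size.

    A minimal sketch of the graph-theoretical idea (Prim's algorithm on a
    complete graph plus a virtual root; the cost model -- zlib size of the
    byte-wise difference -- is an assumption, not the thesis's codec).
    Images are equal-length byte strings.
    """
    def cost(parent, child):
        if parent is None:                      # virtual root: code child alone
            return len(zlib.compress(child))
        diff = bytes((a - b) & 0xFF for a, b in zip(child, parent))
        return len(zlib.compress(diff))

    n = len(images)
    in_tree = [False] * n
    parent = [None] * n                         # None = coded independently
    best = [cost(None, im) for im in images]    # cheapest known attachment
    plan, total = [], 0
    for _ in range(n):
        u = min((i for i in range(n) if not in_tree[i]), key=best.__getitem__)
        in_tree[u] = True
        plan.append((parent[u], u))             # (predictor index, image index)
        total += best[u]
        for v in range(n):                      # relax edges out of u
            if not in_tree[v]:
                c = cost(images[u], images[v])
                if c < best[v]:
                    best[v], parent[v] = c, u
    return plan, total
```

    By construction the MST total can never exceed the cost of compressing every image independently, since the virtual-root edge is always available.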

    Discovering Regularity in Point Clouds of Urban Scenes

    Despite the apparent chaos of the urban environment, cities are actually replete with regularity. From the grid of streets laid out over the earth, to the lattice of windows thrown up into the sky, periodic regularity abounds in the urban scene. Just as salient, though less uniform, are the self-similar branching patterns of trees and vegetation that line streets and fill parks. We propose novel methods for discovering these regularities in 3D range scans acquired by a time-of-flight laser sensor. The applications of this regularity information are broad, and we present two original algorithms. The first exploits the efficiency of the Fourier transform for the real-time detection of periodicity in building facades. Periodic regularity is discovered online by doing a plane sweep across the scene and analyzing the frequency space of each column in the sweep. The simplicity and online nature of this algorithm allow it to be embedded in scanner hardware, making periodicity detection a built-in feature of future 3D cameras. We demonstrate the usefulness of periodicity in view registration, compression, segmentation, and facade reconstruction. The second algorithm leverages the hierarchical decomposition and locality in space of the wavelet transform to find stochastic parameters for procedural models that succinctly describe vegetation. These procedural models facilitate the generation of virtual worlds for architecture, gaming, and augmented reality. The self-similarity of vegetation can be inferred using multi-resolution analysis to discover the underlying branching patterns. We present a unified framework of these tools, enabling the modeling, transmission, and compression of high-resolution, accurate, and immersive 3D images.
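    The frequency-space analysis of a sweep column can be sketched in one dimension (the synthetic signal and the peak-dominance threshold are assumptions; the paper's detector operates on real facade columns):

```python
import numpy as np

def dominant_period(signal):
    """Detect periodic structure in a 1D sweep column via the FFT.

    A minimal sketch of Fourier-based periodicity detection: return the
    period of the strongest non-DC frequency component, or None if no
    bin clearly dominates the spectrum (threshold is an assumption).
    """
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                      # drop the DC component
    mag = np.abs(np.fft.rfft(x))
    k = int(np.argmax(mag[1:])) + 1       # strongest non-DC bin
    if mag[k] < 4.0 * mag[1:].mean():     # no clear periodicity
        return None
    return len(x) / k                     # period in samples
```

    A column crossing a regular window lattice produces a strong spectral peak, while an irregular column fails the dominance test and is rejected.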

    Time-varying volume visualization

    Volume rendering is a very active research field in Computer Graphics because of its wide range of applications in various sciences, from medicine to flow mechanics. In this report, we survey the state of the art in time-varying volume rendering. We state several basic concepts and then establish several criteria to classify the studied works: IVR versus DVR, 4D versus 3D+time, compression techniques, involved architectures, use of parallelism, and image-space versus object-space coherence. We also address other related problems, such as transfer functions and the computation of 2D cross-sections of time-varying volume data. All the papers reviewed are classified into several tables based on this classification and, finally, several conclusions are presented.

    A Real Time Image Processing Subsystem: GEZGIN

    In this study, a real-time image processing subsystem, GEZGIN, which is currently being developed for BILSAT-1, a 100 kg class micro-satellite, is presented. BILSAT-1 is being constructed in accordance with a technology transfer agreement between TÜBİTAK-BİLTEN (Turkey) and SSTL (UK) and is planned to be placed into a 650 km sun-synchronous orbit in Summer 2003. GEZGIN is one of the two Turkish R&D payloads to be hosted on BILSAT-1. One of the missions of BILSAT-1 is constructing a Digital Elevation Model of Turkey using both multi-spectral and panchromatic imagers. Due to limited down-link bandwidth and on-board storage capacity, employment of a real-time image compression scheme is highly advantageous for the mission. GEZGIN has evolved as an implementation to achieve image compression tasks that lead to efficient utilization of both the down-link and on-board storage. The image processing on GEZGIN includes capturing 4-band multi-spectral images of 2048x2048 8-bit pixels, compressing them simultaneously with the new industry-standard JPEG2000 algorithm, and forwarding the compressed multi-spectral image to the Solid State Data Recorders (SSDR) of BILSAT-1 for storage and down-link transmission. The mission definition together with orbital parameters imposes a 6.5-second constraint on real-time image compression. GEZGIN meets this constraint by exploiting the parallelism among image processing units and assigning compute-intensive tasks to dedicated hardware. The proposed hardware also allows for full reconfigurability of all processing units.

    Lossless compression of hyperspectral images

    Band ordering and the prediction scheme are the two major aspects of hyperspectral imaging that have been studied to improve the performance of the compression system. In the prediction module, we propose spatio-spectral prediction methods. Two non-linear spectral prediction methods have been proposed in this thesis. NPHI (Non-linear Prediction for Hyperspectral Images) is based on a band look-ahead technique wherein a reference band is included in the prediction of pixels in the current band. The prediction technique estimates the variation between the contexts of the two bands to modify the weights computed in the reference band to predict the pixels in the current band. EPHI (Edge-based Prediction for Hyperspectral Images) is a modified NPHI technique wherein an edge-based analysis is used to classify the pixels into edges and non-edges in order to perform the prediction of the pixel in the current band. Three ordering methods have been proposed in this thesis. The first ordering method computes the local and global features in each band to group the bands. The bands in each group are ordered by estimating the compression ratios achieved between the bands in the group and then ordering them using Kruskal's algorithm. The other two ordering methods compute the compression ratios between b-neighbors to perform the band ordering.
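    The idea of predicting a band from a reference band can be sketched with a plain least-squares gain/offset fit (an assumed stand-in for the paper's context-based NPHI weights, which are more elaborate): the residual that remains after subtracting the prediction is what an entropy coder would then exploit.

```python
import numpy as np

def predict_band(current, reference):
    """Predict one hyperspectral band from a reference band.

    A minimal sketch of inter-band prediction (a global least-squares
    gain/offset fit, NOT the paper's NPHI/EPHI context weighting):
    fit current ~ a*reference + b and return the prediction residual.
    """
    x = reference.astype(float).ravel()
    y = current.astype(float).ravel()
    a, b = np.polyfit(x, y, 1)            # global gain and offset
    prediction = a * reference.astype(float) + b
    return current.astype(float) - prediction
```

    When adjacent bands are strongly correlated, as is typical in hyperspectral cubes, the residual is far more compressible than the raw band.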

    Improving Embedded Image Coding Using Zero Block - Quad Tree

    The traditional multi-bitstream approach to the heterogeneity issue is very constrained and inefficient for multi-bit-rate applications. Multi-bitstream coding techniques allow partial decoding at various resolution and quality levels. Several scalable coding algorithms have been proposed in international standards over the past decade, but these earlier methods can accommodate only relatively limited decoding properties. To achieve efficient image coding, a multi-resolution compression technique is used. To exploit the multi-resolution structure of an image, wavelet transforms are employed. The wavelet transform decomposes the image into its fundamental resolutions, but the transformed coefficients are observed to be non-integer values, resulting in a variable bit stream; this constrains bit-rate control and slows operation. To overcome this limitation, hierarchical tree-based coding is implemented, which exploits the relation between wavelet scale levels and generates the code stream for transmission.
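    The zero-block quad-tree idea can be sketched as a recursive significance test (the symbol-stream format is an assumption): an all-zero block of wavelet coefficients costs a single symbol, while a significant block is split into four quadrants and coded recursively.

```python
def quadtree_encode(block, out):
    """Zero-block quad-tree coding of a square coefficient array.

    A minimal sketch of zero-block coding (the symbol stream format is
    an assumption): an all-zero block emits a single 0, otherwise a 1
    is emitted and the four quadrants are coded recursively, down to
    individual coefficients.  `block` is a list of equal-length rows.
    """
    n = len(block)
    if all(v == 0 for row in block for v in row):
        out.append(0)                     # whole block insignificant
        return
    if n == 1:
        out.append(block[0][0])           # leaf: the coefficient itself
        return
    out.append(1)                         # significant: descend
    h = n // 2
    for r in (0, h):
        for c in (0, h):
            quadtree_encode([row[c:c + h] for row in block[r:r + h]], out)
```

    For the sparse coefficient arrays a wavelet transform produces, large zero regions collapse to single symbols, which is where the compression comes from.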

    Detecting Poisoning Attacks on Hierarchical Malware Classification Systems

    Anti-virus software based on unsupervised hierarchical clustering (HC) of malware samples has been shown to be vulnerable to poisoning attacks. In this kind of attack, a malicious player degrades anti-virus performance by submitting to the database samples specifically designed to collapse the classification hierarchy utilized by the anti-virus (and constructed through HC) or otherwise deform it in a way that would render it useless. Though each poisoning attack needs to be tailored to the particular HC scheme deployed, existing research seems to indicate that no particular HC method by itself is immune. We present results on applying a new notion of entropy for combinatorial dendrograms to the problem of controlling the influx of samples into the database and deflecting poisoning attacks. In a nutshell, effective and tractable measures of change in hierarchy complexity are derived from the above, enabling on-the-fly flagging and rejection of potentially damaging samples. The information-theoretic underpinnings of these measures ensure their indifference to which particular poisoning algorithm is being used by the attacker, rendering them particularly attractive in this setting.
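    The gatekeeping principle can be illustrated with a toy stand-in for the paper's dendrogram entropy (the partition-level Shannon entropy and the threshold below are assumptions; the paper's measure is defined on the full combinatorial dendrogram): flag an incoming sample if absorbing it shifts the clustering's entropy too much.

```python
import math

def partition_entropy(sizes):
    """Shannon entropy (in bits) of a clustering, from its cluster sizes."""
    n = sum(sizes)
    return -sum(s / n * math.log2(s / n) for s in sizes if s)

def flags_sample(sizes, target_cluster, threshold=0.05):
    """Flag an incoming sample if absorbing it into `target_cluster`
    shifts the clustering's entropy by more than `threshold` bits.

    A toy stand-in for dendrogram-entropy gatekeeping (partition-level
    entropy and the 0.05-bit threshold are assumptions, not the
    paper's measure, which operates on the whole hierarchy).
    """
    before = partition_entropy(sizes)
    after_sizes = list(sizes)
    after_sizes[target_cluster] += 1
    return abs(partition_entropy(after_sizes) - before) > threshold
```

    A single sample barely moves the entropy of a large, stable clustering, but disproportionately shifts a small or fragile one, which is the kind of change the flagging is meant to catch.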

    Implementation of MPEG-4's Subdivision Surface Tools

    This work concerns the implementation of an MPEG-4 decoder for subdivision surfaces, which are powerful 3D paradigms that allow piecewise smooth surfaces to be represented compactly. The study takes place in the framework of MPEG-4 AFX, the extension of the MPEG-4 standard that includes subdivision surfaces. This document introduces, in some detail, the theory of subdivision surfaces in the two forms present in MPEG-4: plain and detailed/wavelet subdivision surfaces. It concentrates in particular on wavelet subdivision surfaces, which permit progressive 3D mesh compression.
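    The refinement principle behind subdivision can be shown in one dimension with Chaikin's corner-cutting scheme for curves (an analogue for illustration only, not one of the MPEG-4 surface schemes): each step replaces every edge by two points, and the control polygon converges to a smooth limit curve.

```python
def chaikin_step(points):
    """One refinement step of Chaikin's corner-cutting subdivision on a
    closed polygon, given as a list of (x, y) tuples.

    A one-dimensional analogue of the subdivision idea above (curves
    rather than the MPEG-4 surface schemes): each edge contributes two
    new points at 1/4 and 3/4 along it, doubling the control point
    count; repeated steps converge to a quadratic B-spline curve.
    """
    refined = []
    for i, (x0, y0) in enumerate(points):
        x1, y1 = points[(i + 1) % len(points)]   # wrap around: closed polygon
        refined.append((0.75 * x0 + 0.25 * x1, 0.75 * y0 + 0.25 * y1))
        refined.append((0.25 * x0 + 0.75 * x1, 0.25 * y0 + 0.75 * y1))
    return refined
```

    Surface schemes such as Loop or Catmull-Clark generalize this edge-splitting-plus-averaging pattern to triangle and quad meshes, which is what makes a coarse base mesh plus subdivision rules such a compact surface representation.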