    An investigative study of multispectral data compression for remotely-sensed images using vector quantization and difference-mapped shift-coding

    A study is conducted to investigate the effects and advantages of data compression techniques on multispectral imagery data acquired by NASA's airborne scanners at the Stennis Space Center. The first technique used was vector quantization, where a vector is defined in the multispectral imagery context as the array of pixel values at the same spatial location in each channel. The error incurred in substituting the reconstructed images for the original set is compared for different compression ratios, and the eigenvalues of the covariance matrix of the reconstructed data set are compared with those of the original set. The effects of varying the size of the vector codebook on the quality of the compression and on subsequent classification are also presented. The output of the vector quantization algorithm was further compressed by a lossless technique called Difference-mapped Shift-extended Huffman coding. For 7 channels of data acquired by the Calibrated Airborne Multispectral Scanner (CAMS), the overall compression was 19.5:1 (0.41 bpp) at an RMS error of 15.8 pixels, and 18:1 (0.447 bpp) at an RMS error of 3.6 pixels. The algorithms were implemented in software and interfaced, through dedicated image-processing boards, to an 80386 PC-compatible computer. Modules were developed for image compression and image analysis, along with supporting software for visual display and interpretation of the compressed and classified images.
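
    The two-stage pipeline described above can be pictured with a short sketch: a k-means codebook is trained over cross-channel pixel vectors, each pixel is replaced by its nearest codeword index, and the index stream is difference-mapped and entropy-coded. This is a minimal reconstruction under stated assumptions, not the paper's implementation: plain Huffman coding stands in for the shift-extended variant, and the codebook size, iteration count, and all names are illustrative.

```python
# Hedged sketch of VQ + difference-mapped Huffman coding for a
# multispectral cube. All parameters here are hypothetical.
import heapq
from collections import Counter

import numpy as np

def train_codebook(vectors, codebook_size=64, n_iters=10, seed=0):
    """Plain Lloyd (k-means) training of a VQ codebook."""
    rng = np.random.default_rng(seed)
    codebook = vectors[rng.choice(len(vectors), codebook_size, replace=False)]
    for _ in range(n_iters):
        # Assign every pixel vector to its nearest codeword.
        dists = np.linalg.norm(vectors[:, None, :] - codebook[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for k in range(codebook_size):
            members = vectors[labels == k]
            if len(members):  # move the codeword to its cell's centroid
                codebook[k] = members.mean(axis=0)
    return codebook

def quantize(cube, codebook):
    """cube: (H, W, C) multispectral image -> (H, W) codeword indices."""
    h, w, c = cube.shape
    dists = np.linalg.norm(cube.reshape(-1, c)[:, None, :] - codebook[None, :, :], axis=2)
    return dists.argmin(axis=1).reshape(h, w)

def difference_map(indices):
    """Raster-order index differences; concentrates the histogram near 0."""
    flat = indices.ravel().astype(np.int64)
    return np.diff(flat, prepend=flat[:1])

def huffman_codes(symbols):
    """Build a Huffman code table for a symbol stream via a min-heap."""
    heap = [(n, i, {s: ""}) for i, (s, n) in enumerate(Counter(symbols).items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        n0, _, c0 = heapq.heappop(heap)
        n1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c0.items()}
        merged.update({s: "1" + code for s, code in c1.items()})
        heapq.heappush(heap, (n0 + n1, next_id, merged))
        next_id += 1
    return heap[0][2]

# Toy usage on a random 7-channel cube standing in for CAMS data.
rng = np.random.default_rng(1)
cube = rng.integers(0, 256, (64, 64, 7))
codebook = train_codebook(cube.reshape(-1, 7).astype(float))
diffs = difference_map(quantize(cube, codebook)).tolist()
codes = huffman_codes(diffs)
bits = sum(len(codes[s]) for s in diffs)
print("rate: %.3f bits per pixel-vector" % (bits / len(diffs)))
```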

    The 1993 Space and Earth Science Data Compression Workshop

    The Earth Observing System Data and Information System (EOSDIS) is described in terms of its data volume, data rate, and data distribution requirements. Opportunities for data compression in EOSDIS are discussed.

    Remote Sensing Data Compression

    A huge amount of data is acquired nowadays by the different remote sensing systems installed on satellites, aircraft, and UAVs. The acquired data then have to be transferred to image processing centres, stored, and/or delivered to customers. In restricted scenarios, data compression is strongly desired or necessary. A wide diversity of coding methods can be used, depending on the requirements and their priority. In addition, the types and properties of images differ greatly, so practical implementation aspects have to be taken into account. The Special Issue paper collection on which this book is based touches on all of the aforementioned items to some degree, giving the reader an opportunity to learn about recent developments and research directions in the field of image compression. In particular, lossless and near-lossless compression of multi- and hyperspectral images remains a current topic, since such images constitute data arrays of extremely large size, rich in information that can be retrieved for various applications. Another important aspect is the impact of lossy compression on image classification and segmentation, where a reasonable compromise between the characteristics of compression and the final tasks of data processing has to be achieved. The problems of data transmission from UAV-based acquisition platforms, as well as the use of FPGAs and neural networks, have become very important. Finally, attempts to apply compressive sensing approaches in remote sensing image processing, with positive outcomes, are observed. We hope that readers will find this book useful and interesting.
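
    To make the near-lossless notion above concrete: a common scheme predicts each sample from an already-coded neighbor and uniformly quantizes the prediction residual with step 2*delta + 1, which bounds the per-sample reconstruction error by delta (delta = 0 reduces to lossless coding). The sketch below is a minimal illustration of that idea, not code from the book; all names and parameters are hypothetical.

```python
# Minimal near-lossless DPCM sketch (illustrative, not from the book):
# left-neighbor prediction plus uniform residual quantization with step
# 2*delta + 1, which caps the absolute reconstruction error at delta.
import numpy as np

def near_lossless_encode(band, delta=1):
    """band: one 2-D image band; returns quantized residual indices
    and the decoder-side reconstruction."""
    band = band.astype(np.int64)
    q = np.zeros_like(band)
    recon = np.zeros_like(band)
    step = 2 * delta + 1
    for i in range(band.shape[0]):
        prev = 0  # the predictor resets at each row start
        for j in range(band.shape[1]):
            residual = band[i, j] - prev
            # The encoder tracks the decoder's reconstruction so the
            # bounded error cannot accumulate along the row.
            q[i, j] = int(np.sign(residual)) * ((abs(residual) + delta) // step)
            prev = prev + q[i, j] * step
            recon[i, j] = prev
    return q, recon

band = np.random.default_rng(0).integers(0, 256, (64, 64))
q, recon = near_lossless_encode(band, delta=2)
assert np.abs(recon - band).max() <= 2  # the near-lossless guarantee
```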

    On-Line Object Feature Extraction for Multispectral Scene Representation

    This thesis investigates a new on-line unsupervised object-feature extraction method that reduces the complexity and costs associated with the analysis of multispectral image data, as well as with data transmission, storage, archival, and distribution. Typically, in remote sensing a scene is represented by spatially disjoint pixel-oriented features. It would appear possible to reduce data redundancy through an on-line unsupervised object-feature extraction process in which combined spatial-spectral object features, rather than the original pixel features, are used for multispectral scene representation. The ambiguity in the object detection process can be reduced if the spatial dependencies that exist among adjacent pixels are intelligently incorporated into the decision-making process. We define the unity relation that must exist among the pixels of an object. The unity relation can be constructed with regard to the adjacency relation and the spectral-feature and spatial-feature characteristics of an object; e.g., AMICA (Automatic Multispectral Image Compaction Algorithm) uses the within-object pixel-feature gradient vector as valuable contextual information to construct the object features, which preserve the class-separability information within the data. For on-line object extraction, we introduce the path-hypothesis, and the basic mathematical tools for its realization are introduced in terms of a specific similarity measure and adjacency relation. AMICA is an example of an on-line preprocessing algorithm that uses unsupervised object-feature extraction to represent the information in multispectral image data more efficiently. As the data are read into the system sequentially, the algorithm partitions the observation space into an exhaustive set of disjoint objects simultaneously with the data acquisition process, where pixels belonging to an object form a path-segment in the spectral space. Each path-segment is characterized by an object-feature set, and the set of object features, rather than the original pixel features, is then used for data analysis and data classification. AMICA is applied to several sets of real image data, and the performance and reliability of the features are evaluated. Example results show an average compaction coefficient of more than 20:1 (this factor is data dependent). The classification performance is improved slightly by using object features rather than the original data, and the CPU time required for classification is reduced by a factor of more than 20 as well. The feature extraction process may be implemented in real time, making its CPU time negligible; even in the simulated satellite environment, the CPU time for this process is less than 15% of the CPU time for classifying the original data.
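
    The sequential partitioning described above can be pictured with a deliberately simplified sketch: pixels arrive in raster order, and each pixel either joins the object of a spectrally similar, already-visited neighbor or starts a new object whose feature is a running mean of its member pixels. This is an assumption-laden stand-in for AMICA's unity relation and path-hypothesis, not the thesis's algorithm; the similarity threshold, the two-neighbor adjacency, and every name below are illustrative. In the abstract's pipeline, the resulting object features, rather than the raw pixels, would then feed classification, which is where the reported compaction and CPU savings arise.

```python
# Simplified on-line object extraction: raster-order region growing
# with a running-mean object feature (illustrative stand-in for AMICA).
import numpy as np

def online_object_extraction(image, threshold=10.0):
    """image: (H, W, C); returns per-pixel labels and object mean features."""
    h, w, c = image.shape
    labels = -np.ones((h, w), dtype=int)
    means, counts = [], []
    for i in range(h):
        for j in range(w):
            pixel = image[i, j].astype(float)
            best, best_dist = -1, threshold
            # Only already-visited neighbors are available on-line:
            # the left pixel and the pixel above (the adjacency relation).
            for ni, nj in ((i, j - 1), (i - 1, j)):
                if 0 <= ni and 0 <= nj:
                    k = labels[ni, nj]
                    dist = np.linalg.norm(pixel - means[k])
                    if dist < best_dist:
                        best, best_dist = k, dist
            if best < 0:
                # No neighbor satisfies the unity relation: new object.
                best = len(means)
                means.append(pixel.copy())
                counts.append(0)
            labels[i, j] = best
            # Update the object feature (running mean) incrementally.
            counts[best] += 1
            means[best] += (pixel - means[best]) / counts[best]
    return labels, np.array(means)
```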

    Computing Fast and Scalable Table Cartograms for Large Tables

    Given an m x n table T of positive weights and a rectangle R with an area equal to the sum of the weights, a table cartogram computes a partition of R into m x n convex quadrilateral faces such that each face has the same adjacencies as its corresponding cell in T and has an area equal to the cell's weight. In this thesis, we explored different table cartogram algorithms for large tables with thousands of cells and investigated the potential applications of large table cartograms. We implemented Evans et al.'s table cartogram algorithm, which guarantees zero area error, and adapted a diffusion-based cartographic transformation approach, FastFlow, to produce large table cartograms. We introduced a constraint-optimization-based table cartogram generation technique, TCarto, leveraging the concept of force-directed layout, and implemented it with column-based and quadtree-based parallelization to compute table cartograms for tables with thousands of cells. We presented several potential applications of large table cartograms for creating diagrammatic representations in various real-life scenarios, e.g., analyzing spatial correlations between geospatial variables, understanding clusters and densities in scatterplots, and creating visual effects in images (e.g., expanding illumination, mosaic-art effects). We presented an empirical comparison among these three table cartogram techniques on two real-life datasets: a meteorological weather dataset and a US State-to-State migration flow dataset. FastFlow and TCarto both performed well on the weather data table; however, for the migration flow data, where the table contained many local optima with large value differences among adjacent cells, FastFlow generated concave quadrilateral faces. We also investigated potential relationships among different measurement metrics such as cartographic error (accuracy), average aspect ratio (the readability of the visualization), computational speed, and the grid size of the table. Furthermore, we augmented the proposed TCarto with angle constraints to enhance the readability of the visualization, conceding some cartographic error, and inspected the potential relationship of the restricted angles with the accuracy and readability of the visualization. In the output of the angle-constrained TCarto algorithm on the migration dataset, the rows and columns of a cell were difficult to identify up to a 20-degree angle constraint, but became identifiable beyond a 40-degree angle constraint.
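
    The force-directed idea behind a TCarto-style optimization can be conveyed by a much-simplified sketch: every cell pushes its four corners away from (or toward) its centroid in proportion to its area error, displacements on shared vertices are averaged, and the rectangle's boundary stays fixed. This heuristic is an illustration under stated assumptions, not the TCarto implementation; it enforces neither zero area error nor convexity nor angle constraints, and all names and step sizes are hypothetical.

```python
# Simplified force-directed table cartogram heuristic (illustrative).
import numpy as np

def quad_area(p):
    """p: (4, 2) corner coordinates in cyclic order (shoelace formula)."""
    x, y = p[:, 0], p[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

def table_cartogram(weights, n_iters=200, step=0.1):
    """weights: (m, n) positive array -> (m+1, n+1, 2) grid vertices."""
    m, n = weights.shape
    targets = weights / weights.sum() * (m * n)  # total area stays m * n
    ys, xs = np.mgrid[0:m + 1, 0:n + 1].astype(float)
    grid = np.stack([xs, ys], axis=-1)
    for _ in range(n_iters):
        disp = np.zeros_like(grid)
        hits = np.zeros((m + 1, n + 1, 1))
        for i in range(m):
            for j in range(n):
                corners = grid[[i, i, i + 1, i + 1], [j, j + 1, j + 1, j]]
                err = targets[i, j] - quad_area(corners)
                centroid = corners.mean(axis=0)
                # A cell with too little area pushes its corners outward.
                push = step * err * (corners - centroid)
                for k, (vi, vj) in enumerate(
                        ((i, j), (i, j + 1), (i + 1, j + 1), (i + 1, j))):
                    disp[vi, vj] += push[k]
                    hits[vi, vj] += 1
        # Average competing displacements; only interior vertices move,
        # so the outer boundary of the rectangle stays fixed.
        grid[1:m, 1:n] += (disp / np.maximum(hits, 1))[1:m, 1:n]
    return grid

# Toy usage: an 8 x 8 table of random positive weights.
grid = table_cartogram(np.random.default_rng(0).random((8, 8)) + 0.1)
```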

    1994 Science Information Management and Data Compression Workshop

    This document is the proceedings from the 'Science Information Management and Data Compression Workshop,' which was held on September 26-27, 1994, at the NASA Goddard Space Flight Center, Greenbelt, Maryland. The Workshop explored promising computational approaches for handling the collection, ingestion, archival, and retrieval of large quantities of data in future Earth and space science missions. It consisted of eleven presentations covering a range of information management and data compression approaches that are being or have been integrated into actual or prototypical Earth or space science data information systems, or that hold promise for such an application. The Workshop was organized by James C. Tilton and Robert F. Cromp of the NASA Goddard Space Flight Center.

    The 1995 Science Information Management and Data Compression Workshop

    This document is the proceedings from the 'Science Information Management and Data Compression Workshop,' which was held on October 26-27, 1995, at the NASA Goddard Space Flight Center, Greenbelt, Maryland. The Workshop explored promising computational approaches for handling the collection, ingestion, archival, and retrieval of large quantities of data in future Earth and space science missions. It consisted of fourteen presentations covering a range of information management and data compression approaches that are being or have been integrated into actual or prototypical Earth or space science data information systems, or that hold promise for such an application. The Workshop was organized by James C. Tilton and Robert F. Cromp of the NASA Goddard Space Flight Center.