14 research outputs found

    Proceedings of the Scientific Data Compression Workshop

    Continuing advances in space and Earth science require increasing amounts of data to be gathered from spaceborne sensors. NASA expects to launch sensors during the next two decades capable of producing an aggregate of 1500 Megabits per second if operated simultaneously. Such high data rates stress all aspects of end-to-end data systems, and technologies and techniques are needed to relieve those stresses. Potential solutions to the massive data rate problem include data editing, greater transmission bandwidths, higher-density and faster storage media, and data compression. Through four subpanels on Science Payload Operations, Multispectral Imaging, Microwave Remote Sensing, and Science Data Management, recommendations were made for research in data compression and scientific data applications to space platforms.
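
    To put the quoted rate in perspective, a back-of-the-envelope calculation (a sketch assuming continuous, simultaneous operation of all sensors) shows the raw volume such a data rate implies:

    ```python
    # Back-of-the-envelope estimate of the daily data volume implied by the
    # quoted aggregate rate of 1500 Megabits per second (assumes continuous,
    # simultaneous sensor operation; purely illustrative).
    aggregate_rate_bps = 1500e6   # 1500 Megabits per second, in bits/s
    seconds_per_day = 86_400

    bits_per_day = aggregate_rate_bps * seconds_per_day
    terabytes_per_day = bits_per_day / 8 / 1e12
    print(f"~{terabytes_per_day:.1f} TB of raw data per day")  # ~16.2 TB/day
    ```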

    The Space and Earth Science Data Compression Workshop

    This document is the proceedings of a Space and Earth Science Data Compression Workshop, which was held on March 27, 1992, at the Snowbird Conference Center in Snowbird, Utah. The workshop was held in conjunction with the 1992 Data Compression Conference (DCC '92), which took place at the same location on March 24-26, 1992. The workshop explored opportunities for data compression to enhance the collection and analysis of space and Earth science data. It consisted of eleven papers presented in four sessions. These papers describe research that is integrated into, or has the potential of being integrated into, a particular space and/or Earth science data information system. Presenters were encouraged to take into account the scientists' data requirements and the constraints imposed by the data collection, transmission, distribution, and archival system.

    Compression and protection of multidimensional data

    The main objective of this thesis is to explore and discuss novel techniques for the compression and protection of multidimensional data (i.e., 3-D medical images, hyperspectral images, 3-D microscopy images, and 5-D functional Magnetic Resonance Images). First, we outline a lossless compression scheme based on a predictive model, denoted the Medical Images Lossless Compression algorithm (MILC). MILC provides a good trade-off between compression performance and hardware resource usage. Since execution speed can be a critical parameter in the medical and medical-related fields, we investigate a parallelization of the MILC compression strategy, denoted Parallel MILC. Parallel MILC can be executed on heterogeneous devices (i.e., CPUs, GPUs, etc.) and achieves significant speedups with respect to MILC. This is followed by important aspects related to the protection of two sensitive types of multidimensional data: 3-D medical images and 3-D microscopy images. Regarding the protection of 3-D medical images, we outline a novel hybrid approach that allows for the efficient compression of 3-D medical images and, at the same time, the embedding of a digital watermark. In relation to the protection of 3-D microscopy images, the simultaneous embedding of two watermarks is explained; it should be noted that 3-D microscopy images are often used in delicate tasks (e.g., forensic analysis). Subsequently, we review a novel predictive structure that is appropriate for the lossless compression of different types of multidimensional data...
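
    As a generic illustration of the predictive principle behind lossless schemes such as MILC (this is a simple slice-to-slice predictor, not the actual MILC predictor, whose details are given in the thesis):

    ```python
    import numpy as np

    def predictive_residuals(volume: np.ndarray) -> np.ndarray:
        """Slice-to-slice predictive decorrelation of a 3-D volume.

        A generic sketch of the predictive principle used by lossless
        schemes such as MILC -- NOT the actual MILC predictor. Each slice
        is predicted from the previous one and only the residual is kept;
        residuals cluster near zero, so they entropy-code cheaply.
        """
        residuals = volume.astype(np.int32).copy()
        residuals[1:] -= volume[:-1].astype(np.int32)  # first slice kept verbatim
        return residuals

    def reconstruct(residuals: np.ndarray) -> np.ndarray:
        """Exact inverse (lossless): cumulative sum along the slice axis."""
        return np.cumsum(residuals, axis=0)

    vol = np.random.randint(0, 4096, size=(8, 64, 64))  # synthetic 12-bit CT-like data
    assert np.array_equal(reconstruct(predictive_residuals(vol)), vol)
    ```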

    The 1995 Science Information Management and Data Compression Workshop

    This document is the proceedings of the Science Information Management and Data Compression Workshop, which was held on October 26-27, 1995, at the NASA Goddard Space Flight Center, Greenbelt, Maryland. The Workshop explored promising computational approaches for handling the collection, ingestion, archival, and retrieval of large quantities of data in future Earth and space science missions. It consisted of fourteen presentations covering a range of information management and data compression approaches that are being or have been integrated into actual or prototypical Earth or space science data information systems, or that hold promise for such an application. The Workshop was organized by James C. Tilton and Robert F. Cromp of the NASA Goddard Space Flight Center.

    Remote access computed tomography colonography

    This thesis presents a novel framework for remote access Computed Tomography Colonography (CTC). The proposed framework consists of several integrated components: medical image data delivery, 2D image processing, 3D visualisation, and feedback provision. Medical image data sets are notoriously large, and preserving the integrity of the patient data is essential; this makes real-time delivery and visualisation a key challenge. The main contribution of this work is the development of an efficient, lossless compression scheme that minimises the size of the data to be transmitted, thereby alleviating transmission delays. The scheme utilises prior knowledge of anatomical information to divide the data into specific regions, and an optimised compression method is then applied to each anatomical region. An evaluation of this technique shows that the proposed 'divide and conquer' approach significantly improves upon the level of compression achieved by more traditional global compression schemes. Another contribution of this work is an improved volume rendering technique that provides real-time 3D visualisations of regions within CTC data sets. Unlike previous hardware acceleration methods, which rely on dedicated devices, this approach employs a series of software acceleration techniques based on the characteristic properties of CTC data. A quantitative and qualitative evaluation indicates that the proposed method achieves real-time performance on a low-cost PC platform without sacrificing any image quality. Fast data delivery and real-time volume rendering are the key features required for remote access CTC. These features are ultimately combined with other relevant CTC functionality to create a comprehensive, high-performance CTC framework, which makes remote access CTC feasible even for standard Web clients with low-speed data connections.
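
    A minimal sketch may help illustrate the 'divide and conquer' idea; the region labels and the use of zlib at two effort levels are hypothetical stand-ins for the anatomically informed segmentation and per-region codecs developed in the thesis:

    ```python
    import zlib
    import numpy as np

    def compress_by_region(volume: np.ndarray, labels: np.ndarray) -> dict:
        """Sketch of region-wise ('divide and conquer') lossless compression.

        `labels` assigns every voxel to an anatomical region; the label
        values here (e.g. 0 = air/background) are hypothetical stand-ins
        for the prior anatomical segmentation used in the thesis. Each
        region is compressed separately so that a codec, or codec setting,
        suited to that region's statistics can be chosen.
        """
        compressed = {}
        for region in np.unique(labels):
            voxels = volume[labels == region].tobytes()
            level = 1 if region == 0 else 9   # spend less effort on background
            compressed[int(region)] = zlib.compress(voxels, level)
        return compressed

    # Toy usage: a random volume with a fake two-region segmentation.
    vol = np.random.randint(0, 4096, size=(16, 64, 64), dtype=np.uint16)
    seg = (vol > 1000).astype(np.uint8)            # hypothetical region mask
    chunks = compress_by_region(vol, seg)
    print({r: len(c) for r, c in chunks.items()})  # compressed bytes per region
    ```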

    Remote Sensing Data Compression

    A huge amount of data is acquired nowadays by remote sensing systems installed on satellites, aircraft, and UAVs. The acquired data then have to be transferred to image processing centres, stored, and/or delivered to customers. In restricted scenarios, data compression is strongly desired or necessary. A wide diversity of coding methods can be used, depending on the requirements and their priority. In addition, the types and properties of images differ greatly; thus, practical implementation aspects have to be taken into account. The Special Issue paper collection taken as the basis of this book touches on all of the aforementioned items to some degree, giving the reader an opportunity to learn about recent developments and research directions in the field of image compression. In particular, lossless and near-lossless compression of multi- and hyperspectral images remains a current topic, since such images constitute data arrays of extremely large size with rich information that can be retrieved from them for various applications. Another important aspect is the impact of lossy compression on image classification and segmentation, where a reasonable compromise between the characteristics of compression and the final tasks of data processing has to be achieved. The problems of data transmission from UAV-based acquisition platforms, as well as the use of FPGAs and neural networks, have also become very important. Finally, attempts to apply compressive sensing approaches in remote sensing image processing with positive outcomes are observed. We hope that readers will find our book useful and interesting.

    Need for speed: Achieving fast image processing in acute stroke care

    This thesis investigates the use of high-performance computing (HPC) techniques in developing imaging biomarkers to support the clinical workflow for acute stroke patients. In the first part of this thesis, we evaluate different HPC technologies and how they can be leveraged by image analysis applications used in the context of acute stroke care. More specifically, Chapter 2 evaluates how computers with multiple computing devices can be used to accelerate medical imaging applications. The size of CT perfusion (CTP) datasets makes data transfers to computing devices time-consuming and, therefore, unsuitable in acute situations; Chapter 3 therefore proposes a novel data compression technique for efficiently processing CTP images on GPUs. Chapter 4 further evaluates the usefulness of the algorithm proposed in Chapter 3 with two different applications: a double-threshold segmentation and a time-intensity profile similarity (TIPS) bilateral filter to reduce noise in CTP scans. Finally, Chapter 5 presents a cloud platform for deploying high-performance medical applications for acute stroke patients. In Part 2 of this thesis, Chapter 6 presents a convolutional neural network (CNN) for the detection and volumetric segmentation of subarachnoid hemorrhages (SAH) in non-contrast CT scans, and Chapter 7 proposes another CNN-based method to quantify final infarct volumes in follow-up non-contrast CT scans from ischemic stroke patients.
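
    As an illustration of the first of those applications, a double-threshold segmentation reduces to a band-pass test on voxel intensities. The sketch below is a plain CPU version with placeholder thresholds; the thesis evaluates the operation together with the GPU compression scheme of Chapter 3:

    ```python
    import numpy as np

    def double_threshold(frame: np.ndarray, low: float, high: float) -> np.ndarray:
        """Keep voxels whose intensity lies within [low, high].

        A plain CPU sketch of double-threshold segmentation; `low` and
        `high` are illustrative placeholders, not the values used in
        the thesis.
        """
        return (frame >= low) & (frame <= high)

    frame = np.random.normal(40.0, 15.0, size=(256, 256))  # synthetic CTP-like slice
    mask = double_threshold(frame, low=20.0, high=80.0)
    print(f"{mask.mean():.1%} of voxels retained")
    ```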

    Recent Advances in Signal Processing

    Signal processing is a critical issue in the majority of new technological inventions and challenges, across a variety of applications in both science and engineering. Classical signal processing techniques have largely worked with mathematical models that are linear, local, stationary, and Gaussian, favoring closed-form tractability over real-world accuracy; these constraints were imposed by the lack of powerful computing tools. During the last few decades, signal processing theories, developments, and applications have matured rapidly and now include tools from many areas of mathematics, computer science, physics, and engineering. This book is targeted primarily toward students and researchers who want to be exposed to a wide variety of signal processing techniques and algorithms. It includes 27 chapters that can be categorized into five areas depending on the application at hand, ordered to address image processing, speech processing, communication systems, time-series analysis, and educational packages, respectively. The book has the advantage of providing a collection of applications that are completely independent and self-contained; the interested reader can therefore choose any chapter and skip to another without losing continuity.

    A survey of the application of soft computing to investment and financial trading
