
    Parallel Implementation of Lossy Data Compression for Temporal Data Sets

    Full text link
    Many scientific data sets contain temporal dimensions: they store information at the same spatial locations but at different time stamps. Some of the largest temporal data sets are produced by parallel computing applications such as simulations of climate change and fluid dynamics. Temporal data sets can be very large and take a long time to transfer among storage locations. With data compression, files can be transferred faster and storage space saved. NUMARCK is a lossy compression algorithm for temporal data sets that learns the emerging distributions of element-wise change ratios along the temporal dimension and encodes them into an index table for a concise representation. This paper presents a parallel implementation of NUMARCK. Evaluated with six data sets obtained from climate and astrophysics simulations, parallel NUMARCK achieved scalable speedups of up to 8788 when running 12800 MPI processes on a parallel computer. We also compare its compression ratios against two other lossy compression algorithms, ISABELA and ZFP, and show that NUMARCK achieves higher compression ratios than both. Comment: 10 pages, HiPC 201
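
    The core encoding step can be pictured in a few lines. Below is a minimal single-process sketch of the idea, assuming 1-D k-means binning of the element-wise change ratios; the bin count, iteration count, and divide-by-zero guard are illustrative choices, not the paper's parallel implementation.

        import numpy as np

        def numarck_encode(prev, curr, n_bins=255, iters=10, seed=0):
            """Bin element-wise change ratios with 1-D k-means; return index table and centers."""
            safe_prev = np.where(prev == 0, 1.0, prev)            # guard against divide-by-zero
            ratios = ((curr - prev) / safe_prev).ravel()
            rng = np.random.default_rng(seed)
            centers = rng.choice(ratios, n_bins, replace=False)   # initial bin centers
            for _ in range(iters):
                idx = np.abs(ratios[:, None] - centers[None, :]).argmin(axis=1)
                for b in range(n_bins):
                    members = ratios[idx == b]
                    if members.size:
                        centers[b] = members.mean()
            return idx.astype(np.uint8), centers                  # one index byte per element

        def numarck_decode(prev, idx, centers):
            """Approximate the next time step by applying each element's binned change ratio."""
            return prev * (1.0 + centers[idx].reshape(prev.shape))

        prev = np.random.rand(64, 64)
        curr = prev * (1.0 + 0.01 * np.random.randn(64, 64))      # small temporal change
        idx, centers = numarck_encode(prev, curr)
        approx = numarck_decode(prev, idx, centers)

    Each element then costs a single index byte plus one shared table of bin centers per time step, which is where the compression comes from; the paper's contribution is to carry this out at scale across MPI processes.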

    A distributed Quadtree Dictionary approach to multi-resolution visualization of scattered neutron data

    Get PDF
    Grid computing is described as dependable, seamless, pervasive access to resources and services, whereas mobile computing allows people to move from place to place while staying connected to resources at each location. Mobile grid computing is a new computing paradigm that joins these two technologies, enabling access to the collection of resources within a user's virtual organization while maintaining the freedom of mobile computing through a service paradigm. A major problem in virtual organizations is needs mismatch, in which one resource requests a service that another resource is unable to fulfill, since virtual organizations are necessarily heterogeneous collections of resources. In this dissertation we propose a solution to the needs-mismatch problem for high-energy physics data. Specifically, we propose a Quadtree Dictionary (QTD) algorithm to provide lossless, multi-resolution compression of data sets and enable their visualization on devices of all capabilities. As a prototype application, we extend the Integrated Spectral Analysis Workbench (ISAW), developed at the Intense Pulsed Neutron Source Division of Argonne National Laboratory, into a mobile Grid application, Mobile ISAW. We compare our QTD algorithm with several existing compression techniques on ISAW's Single-Crystal Diffractometer (SCD) data sets, then extend the algorithm to a distributed setting and examine its effectiveness on the next generation of SCD data sets. In both serial and distributed settings, our QTD algorithm performs no worse than existing techniques such as the square wavelet transform in terms of energy conservation, while providing worst-case savings of 8:1.
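
    As a rough illustration of the multi-resolution idea, here is a hypothetical quadtree decomposition in Python: regions that are uniform within a tolerance become single leaves, so a coarse tolerance yields a small, low-resolution description suitable for a weak device, while a tolerance of zero keeps every distinct region and is lossless. The (row, col, size, mean) leaf layout and the square power-of-two frames are assumptions of this sketch, not the QTD dictionary format.

        import numpy as np

        def quadtree(block, r, c, tol, leaves):
            """Recursively split `block`; record (row, col, size, mean) for near-uniform leaves."""
            if block.max() - block.min() <= tol or block.shape[0] == 1:
                leaves.append((r, c, block.shape[0], float(block.mean())))
                return
            h = block.shape[0] // 2
            quadtree(block[:h, :h], r,     c,     tol, leaves)
            quadtree(block[:h, h:], r,     c + h, tol, leaves)
            quadtree(block[h:, :h], r + h, c,     tol, leaves)
            quadtree(block[h:, h:], r + h, c + h, tol, leaves)

        frame = np.random.rand(64, 64) ** 4          # stand-in for an SCD detector frame
        coarse, exact = [], []
        quadtree(frame, 0, 0, 0.05, coarse)          # few leaves: low-resolution preview
        quadtree(frame, 0, 0, 0.0, exact)            # tol = 0: every distinct region kept, lossless
        print(len(coarse), "coarse leaves vs", len(exact), "exact leaves")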

    Adaptive Basis Scan by Wavelet Prediction for Single-Pixel Imaging

    Get PDF
    Single-pixel camera imaging is an emerging paradigm that allows high-quality images to be produced by a device equipped with only a single point detector. A single-pixel camera is an experimental setup able to measure the inner product of the scene under view (the image) with any user-defined pattern. Post-processing a sequence of point measurements obtained with different patterns makes it possible to recover spatial information, as demonstrated by state-of-the-art approaches belonging to the compressed sensing framework. In this paper, a new framework for the choice of the patterns is proposed, together with a simple and efficient image recovery scheme. Our goal is to avoid the computationally demanding ℓ1-minimization of compressed sensing. We propose to choose patterns from a wavelet basis in an adaptive fashion, which essentially relies on predicting the locations of the significant wavelet coefficients. More precisely, we adopt a multiresolution strategy that exploits the set of measurements acquired at coarse scales to predict the set of measurements to be performed at a finer scale. Prediction is based on fast cubic interpolation in the image domain. A general formalism is given so that any kind of wavelet can be used, enabling the wavelet to be adjusted to the type of images in the desired application. Both simulated and experimental results demonstrate the ability of our technique to reconstruct biomedical images with improved quality compared to CS-based recovery. Real-time fluorescence imaging of biological tissues could benefit from the proposed method.
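
    A simplified sketch of the coarse-to-fine prediction step follows, assuming a Haar basis via PyWavelets and cubic upsampling via SciPy; the function name and the fraction of coefficients kept are illustrative, not the authors' implementation.

        import numpy as np
        import pywt                                    # PyWavelets
        from scipy.ndimage import zoom

        def predict_next_patterns(coarse_img, keep_frac=0.2):
            """Predict which fine-scale wavelet patterns are worth measuring next."""
            pred = zoom(coarse_img, 2, order=3)        # cubic interpolation in the image domain
            _, (cH, cV, cD) = pywt.dwt2(pred, "haar")  # predicted detail coefficients, finer scale
            mags = np.abs(np.stack([cH, cV, cD]))
            thresh = np.quantile(mags, 1.0 - keep_frac)
            return mags >= thresh                      # mask of patterns to acquire next

        coarse = np.random.rand(32, 32)                # image recovered from coarse-scale measurements
        mask = predict_next_patterns(coarse)
        print(mask.sum(), "of", mask.size, "fine-scale patterns selected")

    In an actual acquisition loop, the selected detail patterns would be measured on the device and the procedure repeated scale by scale.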

    CIRA annual report 2003-2004

    Get PDF

    Estimating the concentration of physico chemical parameters in hydroelectric power plant reservoir

    Get PDF
    The United Nations Educational, Scientific and Cultural Organization (UNESCO) defines the Amazon region and adjacent areas, such as the Pantanal, as world heritage territories, since they possess unique flora and fauna and great biodiversity. Unfortunately, these regions have increasingly been suffering from anthropogenic impacts. One of the main anthropogenic impacts of recent decades has been the construction of hydroelectric power plants. Dramatic alteration of these ecosystems has resulted, including changes in water levels, decreased oxygenation, and loss of downstream organic matter, with consequent intense land use and population influxes after the filling and operation of the reservoirs. This in turn leads to extreme loss of biodiversity in these areas, due to large-scale deforestation. The fishing industry in place before the construction of dams and reservoirs, for example, has become much more intense, attracting large populations in search of work, employment, and income. Environmental monitoring is fundamental for reservoir management, and several studies around the world have evaluated the water quality of these ecosystems. The Brazilian Amazon, in particular, goes through well-defined annual hydrological cycles, which are very important since their study aids in monitoring anthropogenic environmental impacts and can inform policy and decision making for the environmental management of the area. The water quality of Amazon reservoirs is greatly influenced by this hydrological cycle, which in turn causes variations in microbiological, physical, and chemical characteristics. Eutrophication, one of the main processes leading to water deterioration in lentic environments, is mostly caused by anthropogenic activities, such as the release of industrial and domestic effluents into water bodies. Physico-chemical water parameters typically related to eutrophication include chlorophyll-a levels, transparency, and total suspended solids, which can thus be used to assess the trophic state of water bodies. Usually, these parameters must be investigated by going out to the field, manually measuring water transparency with a Secchi disk, and taking water samples to the laboratory to obtain chlorophyll-a and total suspended solids concentrations. These processes are time-consuming and require trained personnel. We have therefore proposed techniques for environmental monitoring that do not require fieldwork, based on remote sensing and computational intelligence.
    Simulations in different reservoirs were performed to determine the relationship between these physico-chemical parameters and the spectral response. Based on the in situ measurements, empirical models were established relating the parameters to the reflectance of the reservoir measured by the satellites. The images were calibrated and atmospherically corrected. Statistical analysis using error estimation was applied to identify the most accurate methodology. The neural networks were trained by hydrological cycle and were useful for estimating the physico-chemical parameters of the water from the reflectance of the visible and NIR bands of satellite images, with better results for periods with few clouds in the regions analyzed. The present study applies wavelet neural networks to estimate water quality parameters from the concentrations measured in water samples collected in the Amazon reservoir and the Cefni reservoir, UK. Satellite images from Landsat and Sentinel-2 were used to train the ANNs by hydrological cycle. The trained ANNs showed good agreement between observed and estimated values after atmospheric correction of the satellite images, and they are useful for estimating these concentrations using remote sensing with wavelet transforms for image processing. The techniques proposed and applied in this study are therefore noteworthy, since they can aid in evaluating important physico-chemical parameters, which in turn allows identification of possible anthropogenic impacts, making them relevant to environmental management and policy decision-making processes. The test results showed that the predicted values are accurate, improving the efficiency of monitoring water quality parameters and confirming the reliability and accuracy of the proposed approaches for monitoring water reservoirs. This thesis contributes to evaluating the accuracy of different methods for estimating physico-chemical parameters from satellite images and artificial neural networks. In future work, the accuracy of the results could be improved by adding more satellite images and testing new neural networks in new water reservoirs.
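
    To make the regression setup concrete, here is a hedged sketch of the general approach, with a plain multilayer perceptron standing in for the wavelet neural network used in the thesis. The synthetic data below is a stand-in for calibrated, atmospherically corrected band reflectances paired with in-situ chlorophyll-a samples; all names and constants are illustrative.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        reflectance = rng.uniform(0.0, 0.3, size=(500, 5))   # five visible/NIR bands
        # synthetic chlorophyll-a driven by a NIR/red ratio, standing in for lab measurements
        chl_a = 40 * reflectance[:, 4] / (reflectance[:, 2] + 0.05) + rng.normal(0, 1, 500)

        X_tr, X_te, y_tr, y_te = train_test_split(reflectance, chl_a, random_state=0)
        model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
        model.fit(X_tr, y_tr)
        print("held-out R^2:", round(model.score(X_te, y_te), 3))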

    Multivariate Pointwise Information-Driven Data Sampling and Visualization

    Full text link
    With the increasing computing capability of modern supercomputers, the size of the data generated by scientific simulations is growing rapidly. As a result, application scientists need effective data summarization techniques that can reduce large-scale multivariate spatiotemporal data sets while preserving the important data properties, so that the reduced data can answer domain-specific queries involving multiple variables with sufficient accuracy. While analyzing complex scientific events, domain experts often analyze and visualize two or more variables together to better understand the characteristics of the data features. Data summarization techniques are therefore required to analyze multi-variable relationships in detail and then perform data reduction such that the important features involving multiple variables are preserved in the reduced data. To achieve this, we propose a data sub-sampling algorithm for statistical data summarization that leverages pointwise information-theoretic measures to quantify the statistical association of data points across multiple variables and generates sub-sampled data that preserve that association. Using such reduced data, we show that multivariate feature queries and analysis can be performed effectively. The efficacy of the proposed multivariate-association-driven sampling algorithm is demonstrated by applying it to several scientific data sets. Comment: 25 page
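
    A minimal sketch of pointwise-information-driven sub-sampling for two variables is given below, assuming histogram density estimates and sampling weights shifted to be non-negative; the bin count and sampling fraction are arbitrary choices, not the paper's.

        import numpy as np

        def pmi_sample(v1, v2, frac=0.05, bins=32, seed=0):
            """Sub-sample points with probability weighted by pointwise mutual information."""
            joint, xe, ye = np.histogram2d(v1, v2, bins=bins)
            joint /= joint.sum()                               # joint probability table
            px, py = joint.sum(axis=1), joint.sum(axis=0)      # marginal probabilities
            i = np.digitize(v1, xe[1:-1])                      # bin index of each point
            j = np.digitize(v2, ye[1:-1])
            eps = 1e-12
            pmi = np.log((joint[i, j] + eps) / (px[i] * py[j] + eps))
            w = pmi - pmi.min() + 1e-9                         # shift to positive weights
            rng = np.random.default_rng(seed)
            n = int(frac * len(v1))
            return rng.choice(len(v1), size=n, replace=False, p=w / w.sum())

        v1 = np.random.randn(10000)
        v2 = 0.7 * v1 + 0.3 * np.random.randn(10000)           # two associated variables
        kept = pmi_sample(v1, v2)                              # indices of retained points
        print(len(kept), "points kept out of", len(v1))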

    Data Compression in Multi-Hop Large-Scale Wireless Sensor Networks

    Get PDF
    Data collection from a multi-hop, large-scale outdoor WSN deployment for environmental monitoring is full of challenges due to the severe resource constraints of small battery-operated motes (e.g., bandwidth, memory, power, and computing capacity) and the highly dynamic wireless link conditions of an outdoor communication environment. We present a compressed sensing approach that can recover the sensing data at the sink with good accuracy when very few packets are collected, leading to a significant reduction in network traffic and an extension of the WSN lifetime. Interplaying with the dynamic WSN routing topology, the proposed approach is efficient and simple to implement on resource-constrained motes without requiring them to store part of a random measurement matrix, as opposed to other existing compressed-sensing-based schemes. We provide a systematic method, via machine learning, to find a representation basis for the given WSN deployment and data field that is both sparse and incoherent with the measurement matrix used in the compressed sensing. We validate our approach and evaluate its performance using our real-world multi-hop WSN testbed deployed in situ to collect humidity and soil moisture data. The results show that our approach significantly outperforms three other compressed-sensing-based algorithms in data recovery accuracy for the entire WSN observation field, at drastically reduced communication cost. For some WSN scenarios, compressed sensing may not be applicable, so we also design a generalized predictive coding framework for unified lossless and lossy data compression. In addition, we devise a novel lossless compression algorithm that significantly improves compression performance for various data collections and applications in WSNs. Rigorous simulations show that our proposed framework and compression algorithm outperform several recent popular compression algorithms for wireless sensor networks, such as LEC, S-LZW, and LTC, on various real-world sensor data sets, demonstrating the merit of the proposed framework for unified temporal lossless and lossy data compression in WSNs.
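
    The recovery step at the sink can be sketched generically as follows, assuming a DCT representation basis and orthogonal matching pursuit for the sparse recovery; the paper instead learns a deployment-specific basis via machine learning, which is not reproduced here.

        import numpy as np
        from scipy.fft import idct
        from sklearn.linear_model import OrthogonalMatchingPursuit

        n, m, k = 256, 64, 8                           # field size, packets received, sparsity
        psi = idct(np.eye(n), norm="ortho", axis=0)    # columns form an orthonormal DCT basis
        rng = np.random.default_rng(1)
        coeffs = np.zeros(n)
        coeffs[rng.choice(n, k, replace=False)] = rng.normal(0, 10, k)
        field = psi @ coeffs                           # synthetic sensor field, sparse in the basis

        phi = rng.standard_normal((m, n)) / np.sqrt(m) # measurement matrix (m << n)
        y = phi @ field                                # only m aggregated values reach the sink

        omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
        omp.fit(phi @ psi, y)                          # sparse recovery in the representation basis
        recovered = psi @ omp.coef_
        print("relative error:", np.linalg.norm(recovered - field) / np.linalg.norm(field))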

    A Review of Structural Health Monitoring Techniques as Applied to Composite Structures.

    Get PDF
    Structural Health Monitoring (SHM) is the process of collecting, interpreting, and analysing data from structures in order to determine their health status and remaining life span. Composite materials have been used extensively in recent years in several industries with the aim of reducing the total weight of structures while improving their mechanical properties. However, composite materials are prone to developing damage when subjected to low- to medium-velocity impacts (i.e., 1-10 m/s and 11-30 m/s, respectively). Hence, SHM techniques that can detect damage in composite materials at its incipient stage are of high importance. Despite the availability of several SHM methods for damage identification in composite structures, no single technique has proven suitable for all circumstances. This paper therefore offers updated guidelines for users of composites on some of the recent advances in SHM applied to composite structures. Most studies reported in the literature have concentrated on flat composite plates reinforced with synthetic fibre; with regard to SHM, there are relatively few studies of other structural configurations, such as singly or doubly curved structures, or of hybridised composites reinforced with natural and synthetic fibres.