    Modeling High-throughput Applications for in situ Analytics

    With the goal of performing exascale computing, I/O management becomes increasingly critical to maintaining system performance. While the computing capacities of machines keep growing, the I/O capabilities of systems do not increase as fast. We are able to generate more data but unable to manage them efficiently due to the variability of I/O performance. Limiting the requests to the Parallel File System (PFS) becomes necessary. To address this issue, new strategies such as online in situ analysis are being developed. The idea is to overcome the limitations of basic post-mortem data analysis, where the data have to be stored on the PFS first and processed later. Several software solutions allow users to dedicate nodes specifically to data analysis and to distribute the computation tasks over different sets of nodes. Thus far, they rely on manual partitioning of resources and manual allocation of tasks (simulations, analysis) by the user. In this work, we propose a memory-constrained model for in situ analysis. We use this model to derive different scheduling policies that determine both the number of resources that should be dedicated to analysis functions and an efficient schedule for these functions. We evaluate these policies and show the importance of considering memory constraints in the model. Finally, we discuss the different challenges that have to be addressed in order to build automatic tools for in situ analytics.
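
    The abstract does not spell out the scheduling policies themselves; as a purely illustrative sketch (not the authors' model), one simple memory-constrained policy is to pack analysis tasks onto the fewest dedicated nodes whose memory can hold them. All names and figures below are assumptions.

    def _place(free_mem, task_mem):
        # First-fit: put the task on the first node with enough free memory.
        for i, free in enumerate(free_mem):
            if free >= task_mem:
                free_mem[i] -= task_mem
                return True
        return False

    def min_analysis_nodes(task_mem_gb, node_mem_gb, max_nodes):
        """Fewest nodes that can hold every analysis task (packed
        largest-first), or None if even max_nodes is insufficient."""
        tasks = sorted(task_mem_gb, reverse=True)
        for n in range(1, max_nodes + 1):
            free_mem = [node_mem_gb] * n
            if all(_place(free_mem, t) for t in tasks):
                return n
        return None

    # Example: four analysis tasks needing 30, 20, 20, and 10 GB on 64 GB nodes.
    print(min_analysis_nodes([30, 20, 20, 10], 64, 8))  # -> 2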

    Views from the coalface: chemo-sensors, sensor networks and the semantic sensor web

    Currently, millions of sensors are being deployed in sensor networks across the world. These networks generate vast quantities of heterogeneous data across various levels of spatial and temporal granularity. Sensors range from single-point in situ sensors to remote satellite sensors that can cover the globe. The semantic sensor web should, in principle, allow for the unification of the web with the real world. In this position paper, we discuss the major challenges to this unification from the perspective of sensor developers (especially of chemo-sensors) and of integrating sensor data in real-world deployments. These challenges include: (1) identifying the quality of the data; (2) heterogeneity of data sources and data transport methods; (3) integrating data streams from different sources and modalities (especially contextual information); and (4) pushing intelligence to the sensor level.

    ARMD Workshop on Materials and Methods for Rapid Manufacturing for Commercial and Urban Aviation

    This report documents the goals, organization, and outcomes of the NASA Aeronautics Research Mission Directorate's (ARMD) Materials and Methods for Rapid Manufacturing for Commercial and Urban Aviation Workshop. The workshop began with a series of plenary presentations by leaders in the field of structures and materials, followed by concurrent symposia focused on forecasting the future of various technologies related to rapid manufacturing of metallic materials and polymeric matrix composites, referred to herein as composites. Shortly after the workshop, questionnaires were sent to key workshop participants from the aerospace industry with requests to rank the importance of a series of potential investment areas identified during the workshop. Outcomes from the workshop and the subsequent questionnaires are being used as guidance for NASA investments in this important technology area.

    Optimizing Lossy Compression Rate-Distortion from Automatic Online Selection between SZ and ZFP

    With ever-increasing volumes of scientific data produced by HPC applications, significantly reducing data size is critical because of the limited capacity of storage space and potential bottlenecks on I/O or networks when writing, reading, or transferring data. SZ and ZFP are the two leading lossy compressors available for scientific data sets. However, their performance is not consistent across different data sets, or even across different fields of the same data set: some fields compress better with SZ, while others compress better with ZFP. This situation raises the need for an automatic online (during compression) selection between SZ and ZFP with minimal overhead. In this paper, the automatic selection optimizes the rate-distortion, an important statistical quality metric based on the signal-to-noise ratio. To optimize for rate-distortion, we investigate the principles of SZ and ZFP. We then propose an efficient online, low-overhead selection algorithm that accurately predicts the compression quality of the two compressors in early processing stages and selects the best-fit compressor for each data field. We implement the selection algorithm in an open-source library, and we evaluate the effectiveness of our proposed solution against plain SZ and ZFP in a parallel environment with 1,024 cores. Evaluation results on three data sets representing about 100 fields show that our selection algorithm improves the compression ratio by up to 70% at the same level of data distortion, thanks to very accurate selection (around 99%) of the best-fit compressor, with little overhead (less than 7% in our experiments).
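
    The paper's predictor is specific to SZ and ZFP internals; as a generic illustration only, the sketch below scores each candidate compressor on a strided sample of the field by PSNR per bit (a rate-distortion proxy) and then compresses the full field with the predicted winner. The compress/decompress wrappers are assumed callables, not the real SZ/ZFP APIs.

    import numpy as np

    def psnr(original, decoded):
        # Peak signal-to-noise ratio in dB, using the value range as the peak.
        mse = np.mean((original - decoded) ** 2)
        value_range = float(original.max() - original.min())
        return 10.0 * np.log10(value_range ** 2 / mse)

    def select_compressor(field, candidates, error_bound, sample_step=100):
        """Predict the best-fit compressor on a strided sample of the field,
        then compress the full field with the winner.
        candidates: list of (name, compress_fn, decompress_fn) tuples."""
        sample = field.ravel()[::sample_step]
        best, best_score = None, -np.inf
        for name, compress, decompress in candidates:
            blob = compress(sample, error_bound)
            bits_per_value = 8.0 * len(blob) / sample.size
            score = psnr(sample, decompress(blob)) / bits_per_value  # dB per bit
            if score > best_score:
                best, best_score = (name, compress), score
        name, compress = best
        return name, compress(field, error_bound)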