579 research outputs found

    Sixth Goddard Conference on Mass Storage Systems and Technologies Held in Cooperation with the Fifteenth IEEE Symposium on Mass Storage Systems

    This document contains copies of the technical papers received in time for publication prior to the Sixth Goddard Conference on Mass Storage Systems and Technologies, held in cooperation with the Fifteenth IEEE Symposium on Mass Storage Systems at the University of Maryland-University College Inn and Conference Center, March 23-26, 1998. As one of an ongoing series, this conference continues to provide a forum for discussion of issues relevant to the management of large volumes of data. The conference encourages all interested organizations to discuss long-term mass storage requirements and experiences in fielding solutions. Emphasis is on current and future practical solutions addressing issues in data management, storage systems and media, data acquisition, long-term retention of data, and data distribution. This year's discussion topics include architecture, tape optimization, new technology, performance, standards, site reports, and vendor solutions. Tutorials will be available on shared file systems, file system backups, data mining, and the dynamics of obsolescence.

    File System Simulation: Hierarchical Performance Measurement and Modeling

    File systems are critical components of a computer system. File system simulation can help predict the performance of new system designs. It offers the flexibility of modeling together with the cost and time savings of simulation over full implementation. Being able to predict end-to-end file system performance against a predefined workload can help system designers make decisions that affect an entire product line, involving several million dollars of investment. This dissertation presents detailed simulation-based performance models of the Linux ext3 file system and the PVFS parallel file system. The models are developed using Colored Petri Nets. A performance study using the models shows that the obtained results closely match the expected behavior of the real file systems. The models show that file system parameters have a significantly greater impact on I/O performance than the parameters of the disk subsystem.
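
    The abstract summarizes the models without showing them. As an illustration of the general technique - discrete-event simulation of an I/O path - the sketch below uses plain Python with the simpy library; all timing parameters are invented, and this is not the dissertation's Colored Petri Net model. It estimates end-to-end request latency when fixed file-system overhead sits in front of a queued disk.

        import random
        import simpy

        FS_OVERHEAD = 0.2    # ms per request spent in file system code (assumed value)
        DISK_SERVICE = 5.0   # ms of disk service time per request (assumed value)
        ARRIVAL_MEAN = 8.0   # ms mean inter-arrival time (assumed value)

        latencies = []

        def io_request(env, disk):
            start = env.now
            yield env.timeout(FS_OVERHEAD)       # allocation, journaling, cache lookup
            with disk.request() as slot:         # queue for the single disk
                yield slot
                yield env.timeout(DISK_SERVICE)  # seek + rotation + transfer
            latencies.append(env.now - start)

        def workload(env, disk):
            while True:
                yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN))
                env.process(io_request(env, disk))

        env = simpy.Environment()
        disk = simpy.Resource(env, capacity=1)
        env.process(workload(env, disk))
        env.run(until=10_000)
        print(f"mean latency: {sum(latencies) / len(latencies):.2f} ms "
              f"over {len(latencies)} requests")

    Sweeping FS_OVERHEAD and DISK_SERVICE independently against the same workload gives the kind of parameter-sensitivity comparison the abstract reports.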

    Fifth NASA Goddard Conference on Mass Storage Systems and Technologies

    This document contains copies of the technical papers received in time for publication prior to the Fifth Goddard Conference on Mass Storage Systems and Technologies, held September 17-19, 1996, at the University of Maryland, University Conference Center in College Park, Maryland. As one of an ongoing series, this conference continues to serve as a unique medium for the exchange of information on topics relating to the ingestion and management of substantial amounts of data and the attendant problems involved. This year's discussion topics include storage architecture, database management, data distribution, file system performance and modeling, and optical recording technology. There will also be a paper on Application Programming Interfaces (APIs) for a Physical Volume Repository (PVR) as defined in Version 5 of the Institute of Electrical and Electronics Engineers (IEEE) Reference Model (RM). In addition, there are papers on specific archives and storage products.

    4Sensing - decentralized processing for participatory sensing data

    Work presented within the scope of the Master's in Computer Engineering (Mestrado em Engenharia Informática), as a partial requirement for obtaining the degree of Master in Computer Engineering. Participatory sensing is a new application paradigm, stemming from both technical and social drives, which is currently gaining momentum as a research domain. It leverages the growing adoption of mobile phones equipped with sensors such as a camera, GPS, and an accelerometer, enabling users to collect and aggregate data covering a wide area without incurring the costs associated with a large-scale sensor network. Related research in participatory sensing usually proposes an architecture based on a centralized back-end. Centralized solutions raise a set of issues. On one side, there are the implications of having a centralized repository hosting privacy-sensitive information. On the other side, this centralized model has financial costs that can discourage grassroots initiatives. This dissertation focuses on the data management aspects of a decentralized infrastructure for the support of participatory sensing applications, leveraging the body of work on participatory sensing and related areas such as wireless and internet-wide sensor networks, peer-to-peer data management, and stream processing. It proposes a framework covering a common set of data management requirements - from data acquisition, to processing, storage, and querying - with the goal of lowering the barrier for the development and deployment of applications. Alternative architectural approaches - RTree, QTree and NTree - are proposed and evaluated experimentally in the context of a case-study application - SpeedSense - supporting the monitoring and prediction of traffic conditions through the collection of speed and location samples in an urban setting, using GPS-equipped mobile phones.
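
    The abstract lists the data management stages (acquisition, processing, storage, querying) at a high level. The toy sketch below shows only the central aggregation step of a SpeedSense-like application - reducing raw GPS speed samples to per-cell average speeds. The Sample type, grid size, and coordinates are invented, a grid stands in for real map-matching, and none of 4Sensing's RTree/QTree/NTree distribution is implemented here.

        from collections import defaultdict
        from dataclasses import dataclass

        @dataclass
        class Sample:
            lat: float        # degrees
            lon: float        # degrees
            speed_kmh: float  # instantaneous GPS speed

        CELL = 0.001  # grid cell size in degrees (~100 m); placeholder for map-matching

        def cell_of(s: Sample) -> tuple[int, int]:
            return (int(s.lat / CELL), int(s.lon / CELL))

        def aggregate(samples):
            """Average the observed speed per grid cell."""
            sums, counts = defaultdict(float), defaultdict(int)
            for s in samples:
                k = cell_of(s)
                sums[k] += s.speed_kmh
                counts[k] += 1
            return {k: sums[k] / counts[k] for k in sums}

        samples = [Sample(38.7361, -9.1385, 32.0), Sample(38.7362, -9.1386, 28.0)]
        print(aggregate(samples))  # both samples fall in one cell: average 30.0 km/h

    In the decentralized setting, the same aggregation runs incrementally at the nodes responsible for each spatial partition rather than on a central back-end.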

    The Maunakea Spectroscopic Explorer Book 2018

    (Abridged) This is the Maunakea Spectroscopic Explorer 2018 book. It is intended as a concise reference guide to all aspects of the scientific and technical design of MSE, for the international astronomy and engineering communities and related agencies. The current version is a status report of MSE's science goals and their practical implementation, following the System Conceptual Design Review held in January 2018. MSE is a planned 10-m class, wide-field, optical and near-infrared facility, designed to enable transformative science while filling a critical missing gap in the emerging international network of large-scale astronomical facilities. MSE is completely dedicated to multi-object spectroscopy of samples of between thousands and millions of astrophysical objects. It will lead the world in this arena due to its unique design capabilities: it will boast a large (11.25 m) aperture and wide (1.52 sq. degree) field of view; it will have the capability to observe at a wide range of spectral resolutions, from R~2,500 to R~40,000, with massive multiplexing (4332 spectra per exposure, with all spectral resolutions available at all times) and an on-target observing efficiency of more than 80%. MSE will unveil the composition and dynamics of the faint Universe and is designed to excel at precision studies of faint astrophysical phenomena. It will also provide critical follow-up for multi-wavelength imaging surveys, such as those of the Large Synoptic Survey Telescope, Gaia, Euclid, the Wide Field Infrared Survey Telescope, the Square Kilometre Array, and the Next Generation Very Large Array.
    Comment: 5 chapters, 160 pages, 107 figures

    Carbon-Dioxide Pipeline Infrastructure Route Optimization And Network Modeling For Carbon Capture Storage And Utilization

    Carbon capture, utilization, and storage (CCUS) is a technology value chain which can help reduce CO2 emissions while ensuring sustainable development of the energy and industrial sectors. However, CCUS requires large-scale deployment of infrastructure for capturing meaningful amounts of CO2, which can be capital intensive for stakeholders. In addition, CCUS deployment leads to the development of extensive pipeline corridors, which can be inconsistent with the requirements for future CCUS infrastructure expansion. With the implementation and growth of CCUS technology in the states of North Dakota, Montana, Wyoming, Colorado, and Utah in mind, this dissertation has two major goals: (a) to identify feasible corridors for CO2 pipelines; and (b) to develop a CCUS infrastructure network which minimizes project cost. To address these goals, the dissertation introduces the CCSHawk methodology, which develops pipeline routes and CCUS infrastructure networks using a variety of techniques such as multi-criteria decision analysis (MCDA), graph network algorithms, natural language processing, and linear network optimization. The pipeline route and CCUS network models are designed using open-source data, specifically geo-information, emission quantities, and reservoir properties. The MCDA of the study area reveals that North Dakota, central Wyoming, and eastern Colorado have the highest amount of land suitable for CO2 pipeline corridors. The optimized graph network routing algorithm reduces the overall length of pipeline routes by an average of 4.23% as compared to traditional routing algorithms while maintaining low environmental impact. The linear optimization of the CCUS infrastructure shows that the cost of implementing the technology in the study area can vary between $24.05/tCO2 and $42/tCO2 for capturing 20 to 90 MtCO2. The analysis also reveals a declining economic impact of existing pipeline infrastructure on the future growth of CCUS networks, ranging from $0.01 to $1.62/tCO2 with increasing CO2 capture targets. This research is significant, as it establishes a technique for pipeline route modeling and CCUS economic analysis that is highly adaptable to various geographic regions. To the best of the author's knowledge, it is also the first economic analysis that considers the effect of pre-existing infrastructure on the growth of CCUS technology for the region. Furthermore, the pipeline route model establishes a schema for considering not only environmental factors but also ecological factors for the study area.
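
    The abstract does not reproduce the routing formulation. As a rough sketch of least-cost corridor routing over an MCDA suitability surface - Python with networkx; the grid, costs, and endpoints are invented, and this is not the CCSHawk algorithm - a raster of suitability costs can be turned into a weighted graph and solved with a shortest-path query:

        import networkx as nx

        # Toy 4x4 suitability raster from a hypothetical MCDA: 1 = good corridor,
        # 9 = avoid (e.g., wetlands, dense settlement). All values are invented.
        cost = [
            [1, 1, 5, 9],
            [2, 1, 5, 9],
            [9, 1, 1, 2],
            [9, 8, 1, 1],
        ]

        G = nx.Graph()
        rows, cols = len(cost), len(cost[0])
        for r in range(rows):
            for c in range(cols):
                for dr, dc in ((1, 0), (0, 1)):  # 4-connected neighbors
                    nr, nc = r + dr, c + dc
                    if nr < rows and nc < cols:
                        # edge weight = mean suitability cost of the two cells
                        w = (cost[r][c] + cost[nr][nc]) / 2
                        G.add_edge((r, c), (nr, nc), weight=w)

        source, sink = (0, 0), (3, 3)  # hypothetical CO2 source and reservoir
        route = nx.shortest_path(G, source, sink, weight="weight")  # Dijkstra
        length = nx.shortest_path_length(G, source, sink, weight="weight")
        print(route, length)

    Real corridor models score each cell on many more criteria and then feed candidate routes into a network-level cost optimization, but the graph formulation above is the shared core.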

    3D Spatial Data Infrastructures for web-based Visualization

    In this thesis, concepts for developing Spatial Data Infrastructures with an emphasis on visualizing 3D landscape and city models in distributed environments are discussed. Spatial Data Infrastructures are important for public authorities performing daily tasks and serve as a research topic in geo-informatics. Joint initiatives at national and international level exist for harmonizing procedures and technologies. Interoperability is an important aspect in this context, as the enabling technology for sharing, distributing, and connecting geospatial data and services. The Open Geospatial Consortium is the main driver of international standards in this sector, bringing government agencies, universities, and private companies into a consensus process. 3D city models are becoming increasingly popular, not only in desktop Virtual Reality applications but also for professional use by public authorities. So far, Spatial Data Infrastructures have focused on the storage and exchange of 3D building and elevation data. For efficient streaming and visualization of spatial 3D data in distributed network environments such as the internet, concepts from real-time 3D Computer Graphics must be applied and combined with Geographic Information Systems (GIS). For example, scene graph data structures are commonly used for creating complex and dynamic 3D environments for computer games and Virtual Reality applications, but have not yet been introduced in GIS. In this thesis, several aspects of how to create interoperable and service-based environments for 3D spatial data are addressed. These aspects are covered by publications in journals and conference proceedings. The introductory chapter provides a logical succession from geometrical operations for processing raw data, to data integration patterns, to system designs of single components, to service interface descriptions and workflows, and finally to an architecture of a complete distributed service network. Digital Elevation Models are very important in 3D geo-visualization systems. Data structures, methods, and processes are described for making them available in service-based infrastructures. A specific mesh reduction method is used for generating lower levels of detail from very large point data sets. An integration technique is presented that allows the combination with 2D GIS data such as roads and land use areas. This approach enables another optimization technique that greatly improves usability for immersive 3D applications such as pedestrian navigation: flattening road and water surfaces. It is a geometric operation that uses data structures and algorithms found in numerical simulation software implementing Finite Element Methods. 3D routing is presented as a typical application scenario for detailed 3D city models. Specific problems such as bridges, overpasses, and multilevel networks are addressed and possible solutions described. The integration of routing capabilities into service infrastructures can be accomplished with standards of the Open Geospatial Consortium. An additional service is described for creating 3D networks and generating 3D routes on the fly. Visualization of indoor routes requires different representation techniques. As the server interface providing access to all 3D data, the Web 3D Service has been used and further developed. Integrating and handling scene graph data is described in order to create rich virtual environments. Coordinate transformations of scene graphs are described in detail, an important aspect for ensuring interoperability between systems using different spatial reference systems. The Web 3D Service plays a central part in nearly all experiments carried out: it not only provides the means for interactive web visualizations, but also for performing further analyses, accessing detailed feature information, and automatic content discovery. OpenStreetMap and other worldwide available datasets are used for developing a complete architecture demonstrating the scalability of 3D Spatial Data Infrastructures. Its suitability for creating 3D city models is analyzed according to requirements set by international standards. A full virtual globe system has been developed based on OpenStreetMap, including data processing, database storage, web streaming, and a visualization client. Results are discussed and compared to similar approaches within geo-informatics research, clarifying in which application scenarios and under which requirements the approaches in this thesis can be applied.
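
    The coordinate transformations of scene graphs that the thesis treats as central to interoperability compose per-node transforms from root to leaf. A minimal sketch - Python with numpy; the node layout and coordinate values are invented, and this is not the Web 3D Service implementation - of accumulating 4x4 matrices down a scene graph:

        import numpy as np

        class Node:
            """Scene-graph node: a local 4x4 transform plus child nodes."""
            def __init__(self, transform=None):
                self.transform = np.eye(4) if transform is None else transform
                self.children = []

        def translation(tx, ty, tz):
            m = np.eye(4)
            m[:3, 3] = (tx, ty, tz)
            return m

        def world_positions(node, parent=np.eye(4)):
            """Accumulate transforms root-to-leaf; yield each node's world origin."""
            world = parent @ node.transform
            yield world[:3, 3]
            for child in node.children:
                yield from world_positions(child, world)

        city = Node(translation(463_000.0, 5_682_000.0, 0.0))  # invented UTM-like offset
        building = Node(translation(12.0, 40.0, 0.0))          # local offset in the city
        city.children.append(building)
        for p in world_positions(city):
            print(p)

    For affine changes of reference frame, reprojection reduces to pre-multiplying the root transform; full datum transformations between spatial reference systems are nonlinear and need more than a single matrix, which is why the thesis discusses them in detail.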

    Mu2e Technical Design Report

    The Mu2e experiment at Fermilab will search for charged lepton flavor violation via the coherent conversion process mu- N --> e- N with a sensitivity approximately four orders of magnitude better than the current world's best limits for this process. The experiment's sensitivity offers discovery potential over a wide array of new physics models and probes mass scales well beyond the reach of the LHC. We describe herein the preliminary design of the proposed Mu2e experiment. This document was created in partial fulfillment of the requirements necessary to obtain DOE CD-2 approval.
    Comment: compressed file, 888 pages, 621 figures, 126 tables; full resolution available at http://mu2e.fnal.gov; corrected typo in background summary, Table 3.