
    Geospatial Standards and the Knowledge Generation Lifecycle

    Standards play an essential role at each stage in the sequence of processes by which knowledge is generated from geoscience observations, simulations and analysis. This paper provides an introduction to the field of informatics and the knowledge generation lifecycle in the context of the geosciences. In addition, we discuss how the newly formed Earth Science Informatics Technical Committee is helping to advance the application of standards and best practices to make data and data systems more usable and interoperable.

    Call to action for global access to and harmonization of quality information of individual Earth science datasets

    Knowledge about the quality of data and metadata is important to support informed decisions on the (re)use of individual datasets and is an essential part of the ecosystem that supports open science. Quality assessments reflect the reliability and usability of data. They need to be consistently curated, fully traceable, and adequately documented, as these are crucial for sound decision- and policy-making efforts that rely on data. Quality assessments also need to be consistently represented and readily integrated across systems and tools to allow for improved sharing of information on quality at the dataset level for individual quality attributes or dimensions. Although the need for assessing the quality of data and associated information is well recognized, methodologies for an evaluation framework and presentation of resultant quality information to end users may not have been comprehensively addressed within and across disciplines. Global interdisciplinary domain experts have come together to systematically explore the needs, challenges and impacts of consistently curating and representing quality information through the entire lifecycle of a dataset. This paper describes the findings of that effort, argues the importance of sharing dataset quality information, calls for community action to develop practical guidelines, and outlines community recommendations for developing such guidelines. Practical guidelines will allow for global access to and harmonization of quality information at the level of individual Earth science datasets, which in turn will support open science.

    OpenAltimetry - rapid analysis and visualization of spaceborne altimeter data

    NASA's Ice, Cloud, and land Elevation Satellite-2 (ICESat-2) carries a laser altimeter that fires 10,000 pulses per second towards Earth and records the travel time of individual photons to measure the elevation of the surface below. The volume of data produced by ICESat-2, nearly a TB per day, presents significant challenges for users wishing to efficiently explore the dataset. NASA's National Snow and Ice Data Center (NSIDC) Distributed Active Archive Center (DAAC), which is responsible for archiving and distributing ICESat-2 data, provides search and subsetting services on mission data products, but providing the interactive data discovery and visualization tools needed to assess data coverage and quality in a given area of interest is outside of NSIDC's mandate. The OpenAltimetry project, a NASA-funded collaboration between NSIDC, UNAVCO and the University of California San Diego, has developed a web-based cyberinfrastructure platform that allows users to locate, visualize, and download ICESat-2 surface elevation data and photon clouds for any location on Earth, on demand. OpenAltimetry also provides access to elevations and waveforms for ICESat (the predecessor mission to ICESat-2). In addition, OpenAltimetry enables data access via APIs, opening opportunities for rapid access, experimentation, and computation via third-party applications like Jupyter notebooks. OpenAltimetry emphasizes ease of use for new users and rapid access to entire altimetry datasets for experts, and has been successful in meeting the needs of different user groups. In this paper we describe the principles that guided the design and development of the OpenAltimetry platform and provide a high-level overview of the cyberinfrastructure components of the system.
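    The measurement principle summarized above, timing individual photons to infer surface elevation, can be sketched in a few lines. This is an illustrative simplification, not ICESat-2's actual processing chain: the function name and the example travel time are invented for the demonstration, and real processing accounts for atmospheric delay, pointing geometry, and orbit determination.

    ```python
    # Speed of light in vacuum, m/s.
    C = 299_792_458.0

    def photon_range_m(two_way_travel_time_s: float) -> float:
        """Range from instrument to surface for a single photon.

        The photon travels down and back, so the one-way distance
        is c * t / 2. Surface elevation then follows from subtracting
        this range from the satellite's known orbital position.
        """
        return C * two_way_travel_time_s / 2.0

    # A photon returning after ~3.2 ms corresponds to a range of
    # roughly 480 km, on the order of ICESat-2's orbital altitude.
    r = photon_range_m(3.2e-3)
    ```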

    Remote sensing and GIS technology in the Global Land Ice Measurements from Space (GLIMS) project

    Global Land Ice Measurements from Space (GLIMS) is an international consortium established to acquire satellite images of the world’s glaciers, analyze them for glacier extent and changes, and to assess these change data in terms of forcings. The consortium is organized into a system of Regional Centers, each of which is responsible for glaciers in its region of expertise. Specialized needs for mapping glaciers in a distributed analysis environment require considerable work developing software tools: terrain classification emphasizing snow, ice, water, and admixtures of ice with rock debris; change detection and analysis; visualization of images and derived data; interpretation and archival of derived data; and analysis to ensure consistency of results from different Regional Centers. A global glacier database has been designed and implemented at the National Snow and Ice Data Center (Boulder, CO); parameters have been expanded from those of the World Glacier Inventory (WGI), and the database has been structured to be compatible with (and to incorporate) WGI data. The project as a whole was originated, and has been coordinated, by the US Geological Survey (Flagstaff, AZ), which has also led the development of an interactive tool for automated analysis and manual editing of glacier images and derived data (GLIMSView). This article addresses remote sensing and Geographic Information Science techniques developed within the framework of GLIMS in order to fulfill the goals of this distributed project. Sample applications illustrating the developed techniques are also shown.
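    One common approach to the snow and ice terrain classification mentioned above is a band-ratio test such as the Normalized Difference Snow Index (NDSI), which exploits snow's high green reflectance and low shortwave-infrared reflectance. The sketch below is a generic illustration of that technique, not GLIMS's or GLIMSView's specific algorithm; the 0.4 threshold is a common rule of thumb and the toy reflectance values are invented.

    ```python
    import numpy as np

    def classify_snow_ice(green: np.ndarray, swir: np.ndarray,
                          threshold: float = 0.4) -> np.ndarray:
        """Return a boolean mask that is True where NDSI exceeds the threshold.

        NDSI = (green - SWIR) / (green + SWIR); snow and ice score high
        because they are bright in green but dark in shortwave infrared.
        """
        ndsi = (green - swir) / (green + swir + 1e-12)  # epsilon avoids 0/0
        return ndsi > threshold

    # Toy per-pixel reflectances: first pixel snow-like, others not.
    green = np.array([0.9, 0.3, 0.8])
    swir = np.array([0.1, 0.4, 0.7])
    mask = classify_snow_ice(green, swir)  # → [True, False, False]
    ```

    Debris-covered ice, also mentioned in the abstract, defeats this simple spectral test and is why the project needed manual editing tools alongside automated classification.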

    Introduction: Global Glacier Monitoring—a Long-Term Task Integrating in Situ Observations and Remote Sensing

    This book focuses on the complexities of glaciers as documented via satellite observations. The complexities drive much scientific interest in the subject. The essence—that the world’s glaciers and ice caps exhibit overwhelming retreat—is also developed by this book. In this introductory chapter, we aim to provide the reader with background information to better understand the integration of the glacier-mapping initiative known as Global Land Ice Measurements from Space (GLIMS, http://www.glims.org) within the framework of internationally coordinated glacier-monitoring activities. The chapter begins with general definitions of perennial ice on land and its global coverage, followed by a section on the relation between glaciers and climate. Brief overviews of the specific history of internationally coordinated glacier monitoring and the global monitoring strategy for glaciers and ice caps are followed by a summary of available data. We introduce the potential and challenges of satellite remote sensing for glacier monitoring in the 21st century and emphasize the importance of integrative change assessments. Lastly, we provide a synopsis of the book structure as well as some concluding remarks on worldwide glacier monitoring.

    Data de- and re-dimensioning for optimized brokering access

    Data brokering systems aim to facilitate the exchange of data and models between disciplines in an increasingly transparent manner, thereby accelerating scientific discovery. Researchers from many different yet complementary geoscience disciplines need to access datasets from other fields with significantly different data formats and, in most cases, different time and space dimensionality than their own field commonly uses. With large datasets this causes problems, as the difference in dimensionality often means the entire dataset must be read in order to extract the limited information the researcher is interested in. In this poster we present methods for removing the dimensionality from datasets, both physically on the data-serving side and from a broker-based virtual perspective, so that the data brokering system can quickly access the smaller subset of data in the correct dimensionality for any given scientific field.

    What we did. We de- and re-dimensioned the large reanalysis dataset CFSR to test alternative data paradigms that enhance the performance of single-location extraction while maintaining the needed performance for spatial time-step extraction.

    Results. We were able to increase single-location access performance by 10,000x, though spatial time-step access decreased by a factor of 10. Spatial requirements increased by a factor of 7.

    Discussion.

    Cross-science data issues. Data brokering systems facilitate the exchange of data between disciplines, but the broker cannot be responsible for optimizing the data structures for all sciences. An optimal data paradigm for one science is likely inefficient for other sciences, due to the native data dimensionality of each field.

    Benchmark quality. While the computational systems hosting each dataset are very dissimilar for this benchmark, it is fair to say that the results are biased toward the CISL infrastructure.

    Hosting space requirements. Space requirements must also be considered. Since the compression of shorter strings is less efficient than that of dense grids, and locational data has to be added to each point, the new paradigm in this study requires 4 times the storage space.

    Re-dimensioned usage. Over 22,000 single-location and multi-grid-point requests have been made to the TAMU and VTech servers, estimated at roughly 3,000 day-equivalents of the RDA subset service, which would not have been possible with the given computational resources.
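    The de-/re-dimensioning trade-off described above can be demonstrated with a small array example. This is an illustrative sketch, not the CFSR pipeline: a reanalysis-style cube stored (time, lat, lon) keeps each spatial snapshot contiguous, so extracting one location's full time series touches every time step; re-dimensioning to (lat, lon, time) makes each location's series contiguous instead, at the cost of the snapshot access pattern. The array shapes below are invented for the demonstration.

    ```python
    import numpy as np

    # A small reanalysis-style cube in its native (time, lat, lon) layout.
    time, lat, lon = 1000, 10, 10
    cube = np.arange(time * lat * lon, dtype=np.float32).reshape(time, lat, lon)

    # Re-dimension once: move time to the fastest-varying (contiguous) axis.
    redim = np.ascontiguousarray(cube.transpose(1, 2, 0))  # shape (lat, lon, time)

    # Single-location time series: a strided gather across the whole cube
    # in the old layout, one contiguous read in the new one.
    series_old = cube[:, 3, 7]
    series_new = redim[3, 7, :]
    assert np.array_equal(series_old, series_new)

    # The cost: a spatial snapshot (one time step, all locations) now has
    # the strided access pattern, mirroring the slowdown reported above.
    snapshot_old = cube[500, :, :]
    snapshot_new = redim[:, :, 500]
    assert np.array_equal(snapshot_old, snapshot_new)
    ```

    The same values exist in both layouts; only which access pattern is contiguous changes, which is why a broker serving many disciplines cannot pick one layout that is optimal for all of them.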