
    An integrative solution for managing, tracing and citing sensor-related information

    In a data-driven scientific world, the need to capture information on sensors used in the data acquisition process has become increasingly important. Following the recommendations of the Open Geospatial Consortium (OGC), we started by adopting the SensorML standard for describing platforms, devices and sensors. However, it soon became obvious to us that understanding, implementing and filling such standards costs significant effort and cannot be expected from every scientist individually. We therefore developed a web-based sensor management solution (https://sensor.awi.de) for describing platforms, devices and sensors as a hierarchy of systems which supports tracing changes to a system while hiding complexity. Each platform contains devices, and each device can have sensors associated with specific identifiers, contacts, events, related online resources (e.g. manufacturer factsheets, calibration documentation, data processing documentation), sensor output parameters and geo-location. In order to better understand and address real-world requirements, we closely interacted with field-going scientists in the context of the key national infrastructure project “FRontiers in Arctic marine Monitoring ocean observatory” (FRAM) during the software development. We learned that not only the lineage of observations is crucial for scientists, but also, for example, alert services using value ranges, flexible output formats and information on data providers (e.g. FTP sources). Most importantly, persistent and citable versions of sensor descriptions are required for traceability and reproducibility, allowing seamless integration with existing information systems, e.g. PANGAEA. Within the context of the EU-funded Ocean Data Interoperability Platform project (ODIP II) and in cooperation with 52north, we are providing near real-time data via Sensor Observation Services (SOS) along with sensor descriptions based on our sensor management solution. ODIP II also aims to develop a harmonized SensorML profile for the marine community, which we will adopt in our solution as soon as it becomes available. In this presentation we will show our sensor management solution, which is embedded in our data flow framework to offer out-of-the-box interoperability with existing information systems and standards. In addition, we will present real-world examples and challenges related to the description and traceability of sensor metadata.
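    As a minimal illustration of the "hierarchy of systems" idea described above, the following Python sketch models a platform/device/sensor tree with events and output parameters. All class and field names are our own simplification for illustration and do not reflect the actual sensor.awi.de data model or API.

        # Sketch: a platform -> device -> sensor hierarchy with events and output
        # parameters. Field names are illustrative only, not the sensor.awi.de schema.
        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class Event:
            label: str          # e.g. "deployment", "recovery", "calibration"
            timestamp: str      # ISO 8601

        @dataclass
        class OutputParameter:
            name: str           # e.g. "sea_water_temperature"
            unit: str           # e.g. "degC"

        @dataclass
        class System:
            identifier: str                                   # persistent, citable identifier
            contacts: List[str] = field(default_factory=list)
            events: List[Event] = field(default_factory=list)
            outputs: List[OutputParameter] = field(default_factory=list)
            children: List["System"] = field(default_factory=list)  # sub-systems

        def iter_systems(root: System):
            """Walk the hierarchy depth-first, yielding every system."""
            yield root
            for child in root.children:
                yield from iter_systems(child)

        # Hypothetical example: a platform carrying one device with one sensor
        ctd = System("hypothetical:ctd-001",
                     outputs=[OutputParameter("sea_water_temperature", "degC")],
                     events=[Event("deployment", "2017-06-01T12:00:00Z")])
        platform = System("hypothetical:polarstern",
                          children=[System("hypothetical:frame-01", children=[ctd])])

        for system in iter_systems(platform):
            print(system.identifier)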

    SENSOR.awi.de: Management of heterogeneous platforms and sensors

    SENSOR.awi.de is a component of our data flow framework designed to enable a semi-automated flow of sensor observations to archives (acronym O2A). The dramatic increase in the number and type of platforms and respective sensors operated by the Alfred Wegener Institute, along with complex project-driven requirements in terms of satellite communication, sensor monitoring, quality control and validation, processing pipelines, visualization, and archival under FAIR principles, led us to build a generic and cost-effective data flow framework. Most importantly, all components and services which make up this framework are extensible and exchangeable, were built using open-source technologies (e.g. Elasticsearch) and vocabularies (SeaVox NERC 2.0 vocabulary), and are compliant with various interoperability standards recommended by the international community. In this poster we illustrate the SENSOR.awi.de component, which is the first step in the data acquisition process. In this component we have adopted the OGC standard SensorML 2.0 in order to describe not only sensor-specific information (provenance/lineage metadata, data governance, physical characteristics, sensor positioning within the platform, parameter accuracy, etc.) but also related events (e.g. station numbers as assigned in the data acquisition system on board, along with actions such as deployment and recovery) and digital resources relevant for documenting the scientific workflows (e.g. calibration certificates). To this end we have developed an AWI-specific SensorML profile and are sharing the model; as partners in the EU-funded project ODIP II we are contributing towards the generic Marine Sensor Profile. We have also been systematically sharing our experience in the RDA "Marine Data Harmonization" Interest Group. In SENSOR.awi.de we are not only keen to describe sensors but also to create a sustainable identification solution. Because the payload of various platforms changes over time and sensor calibration may affect the data streams, it is important to keep track of these changes. To this end we set up an audit trail history solution which allows scientists to create and identify an individual version/instance of a sensor. Each individual sensor instance and version is assigned a handle as persistent identifier. These persistent identifiers enable the creation of citations for individual sensor instances at a given timestamp, which can be used in publications and as part of the metadata associated with the final dataset and/or data product archived, e.g., in PANGAEA. To date, ~1,300 sensors have been described in SENSOR.awi.de. Scientific Discipline/Research Area: Sensor Registry, Persistent Identification of Instruments, Provenance Metadata. Relevance/Link to RDA: Input to/from the RDA "Persistent Identification of Instruments", "Data Citation" and "PID Kernel Information" Working Groups and the "Vocabulary Services" Interest Group.
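    As a rough illustration of how such handle-based sensor-version identifiers could be consumed downstream, the sketch below resolves a handle through the public Handle System REST API and assembles a simple citation string. The example handle and the citation format are placeholders, not the actual identifiers or citation style used by SENSOR.awi.de.

        # Sketch: resolve a sensor-version handle via the Handle System REST API
        # and build a human-readable citation string. The handle below is a
        # placeholder, not a real SENSOR.awi.de identifier.
        import requests

        HANDLE = "21.T12345/placeholder-sensor-v2"   # hypothetical handle

        def resolve_handle(handle: str) -> dict:
            """Return the handle record (registered URL and typed values) as JSON."""
            url = f"https://hdl.handle.net/api/handles/{handle}"
            response = requests.get(url, timeout=10)
            response.raise_for_status()
            return response.json()

        def citation(title: str, version: str, handle: str, year: int) -> str:
            """Assemble an illustrative citation for a versioned sensor description."""
            return f"Alfred Wegener Institute ({year}): {title}, version {version}. hdl:{handle}"

        print(citation("CTD on mooring, FRAM observatory", "2", HANDLE, 2018))
        # resolve_handle(HANDLE) would return the values registered for that handle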

    DASHBOARD.awi.de: Streaming and monitoring solutions for near real-time data

    DASHBOARD.awi.de is a component of our data flow framework designed to enable a semi-automated flow of sensor observations to archives (acronym O2A). The dramatic increase in the number and type of platforms and respective sensors operated by the Alfred Wegener Institute, along with complex project-driven requirements in terms of satellite communication, sensor monitoring, quality control and validation, processing pipelines, visualization, and archival under FAIR principles, led us to build a generic and cost-effective data flow framework. Most importantly, all components and services which make up this framework are extensible and exchangeable, were built using open-source technologies (e.g. Elasticsearch) and vocabularies (SeaVox NERC 2.0 vocabulary), and are compliant with various interoperability standards recommended by the international community. In this poster we illustrate the DASHBOARD.awi.de component, a web-based monitoring environment for supporting scientists in the graphing, mapping and simple analysis of time series. With a set of fit-for-purpose widgets, including data download, scientists are able to identify gaps and outliers in the streamed data. Moreover, we are in the process of building alerting solutions for individual parameters using the parameter properties available from SENSOR.awi.de (e.g. min/max parameter range). The streaming services attached to this component use the near real-time data transferred from remote field sites to local databases and storage systems. The graphing solutions built within DASHBOARD.awi.de can easily be re-used in other contexts (e.g. project web pages). Our solutions support the OGC standard Observations and Measurements (O&M), JSON and CSV as exchange data formats. To date, ~150 individual parameters are being transferred in near real-time from remote sites to our onshore storage systems and monitored by scientists. In the week of the 11th RDA Plenary in Berlin, we will be able to interact with live data from Polarstern crossing the Drake Passage on its way to Antarctica. Scientific Discipline/Research Area: Data flow, standardized vocabulary, Visualisation, Streaming. Relevance/Link to RDA: The streaming services attached to this component use near real-time data transferred from remote field sites to local databases and storage systems; to this end, we are participating in the RDA "Array Database Assessment" Working Group and "Big Data" Interest Group.
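    As a simple illustration of the range-based alerting and gap detection ideas mentioned above, the Python sketch below flags observations that fall outside a min/max range registered for a parameter and reports gaps longer than the expected sampling interval. The thresholds, record layout and logic are assumptions for illustration, not the DASHBOARD.awi.de implementation.

        # Sketch: range check and gap detection on a near real-time stream.
        # The thresholds and record layout are illustrative assumptions only.
        from datetime import datetime, timedelta

        VALID_RANGE = (-2.0, 30.0)          # e.g. min/max range from the sensor description
        MAX_GAP = timedelta(minutes=10)     # expected sampling interval plus margin

        def check_stream(records):
            """records: iterable of (ISO 8601 timestamp, float value) tuples."""
            alerts = []
            previous_time = None
            for ts, value in records:
                t = datetime.fromisoformat(ts)
                if previous_time is not None and t - previous_time > MAX_GAP:
                    alerts.append(f"gap before {ts}")
                if not (VALID_RANGE[0] <= value <= VALID_RANGE[1]):
                    alerts.append(f"out-of-range value {value} at {ts}")
                previous_time = t
            return alerts

        sample = [("2018-03-20T10:00:00", 4.2),
                  ("2018-03-20T10:05:00", 55.0),   # outlier
                  ("2018-03-20T10:30:00", 4.1)]    # gap
        print(check_stream(sample))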

    DATA.awi.de: A one-stop-shop framework for discovery

    DATA.awi.de is a component of our data flow framework designed to enable a semi-automated flow of sensor observations to archives (acronym O2A). The dramatic increase in the number and type of platforms and respective sensors operated by the Alfred Wegener Institute, along with complex project-driven requirements in terms of satellite communication, sensor monitoring, quality control and validation, processing pipelines, visualization, and archival under FAIR principles, led us to build a generic and cost-effective data flow framework. Most importantly, all components and services which make up this framework are extensible and exchangeable, were built using open-source technologies (e.g. Elasticsearch) and vocabularies (SeaVox NERC 2.0 vocabulary), and are compliant with various interoperability standards recommended by the international community. In this poster we illustrate the DATA.awi.de component, a one-stop-shop framework for enabling discovery and dissemination of heterogeneous scientific information. Because the metadata and data generated and captured by the other O2A components are machine-readable and interoperable, we were able to build harvesting and indexing solutions which enable scientists and other stakeholders to discover content ranging from platforms/sensors, tracklines, field reports and near real-time data to quality-controlled data, map products and peer-reviewed publications. Scientific Discipline/Research Area: Findability, Interoperability, Integration, Re-use. Relevance/Link to RDA: In the context of our harvesting approach, we are interacting with the RDA "Brokering Framework" Working Group and "Brokering" Interest Group.
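    To give a flavour of the harvesting-and-indexing approach, the sketch below pushes one harvested metadata record into an Elasticsearch index and runs a full-text query against it using the standard Elasticsearch REST API. The endpoint, index name and record fields are assumptions for illustration and do not describe the actual DATA.awi.de setup.

        # Sketch: index one harvested metadata record in Elasticsearch and search it.
        # Endpoint, index name and field names are illustrative assumptions.
        import requests

        ES = "http://localhost:9200"        # hypothetical local Elasticsearch node
        INDEX = "o2a-catalogue"             # hypothetical index name

        record = {
            "type": "sensor",
            "title": "CTD on mooring F4-S, FRAM observatory",
            "keywords": ["temperature", "salinity", "Arctic"],
        }

        # Index the record (Elasticsearch creates the index on first use by default)
        requests.put(f"{ES}/{INDEX}/_doc/1", json=record, timeout=10).raise_for_status()

        # Full-text search across the indexed metadata
        query = {"query": {"match": {"title": "FRAM"}}}
        hits = requests.post(f"{ES}/{INDEX}/_search", json=query, timeout=10).json()
        print(hits["hits"]["total"])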

    MOSAiC goes O2A - Arctic Expedition Data Flow from Observations to Archives

    During the largest polar expedition in history, starting in September 2019, the German research icebreaker Polarstern spends a whole year drifting with the ice through the Arctic Ocean. The MOSAiC expedition takes the closest look ever at the Arctic, even throughout the polar winter, to gain fundamental insights and unique on-site data for a better understanding of global climate change. Hundreds of researchers from 20 countries are involved. Scientists will use the in situ gathered data instantaneously in near real-time mode as well as long afterwards all around the globe, taking climate research to a completely new level. Hence, proper data management, sampling strategies agreed beforehand, and monitoring of the actual data flow, as well as processing, analysis and sharing of data during and long after the MOSAiC expedition, are the most essential tools for scientific gain and progress. To prepare for that challenge we adapted and integrated the research data management framework O2A “Data flow from Observations to Archives” to the needs of the MOSAiC expedition, on board Polarstern as well as on land for data storage and access at the Alfred Wegener Institute Computing and Data Center in Bremerhaven, Germany. Our O2A framework assembles a modular research infrastructure comprising a collection of tools and services. These components allow researchers to register all necessary sensor metadata beforehand, linked to automated data ingestion, to ensure and monitor the data flow, and to process, analyze, and publish data, turning the most valuable and uniquely gained Arctic data into scientific outcomes. The framework further allows for the integration of data obtained with discrete sampling devices into the data flow. These requirements have led us to adapt the generic and cost-effective framework O2A to enable, control, and access the flow of sensor observations to archives in a cloud-like infrastructure on board Polarstern and later on to land-based repositories for international availability. Major challenges addressed by the MOSAiC-O2A data flow framework are (i) the increasing number and complexity of research platforms, devices, and sensors, (ii) the heterogeneous, interdisciplinary requirements towards, e.g., satellite data, sensor monitoring, in situ sample collection, quality assessment and control, processing, analysis and visualization, and (iii) the demand for near real-time analyses on board as well as on land with limited satellite bandwidth. The key modules of O2A's digital research infrastructure established by AWI implement the FAIR principles: SENSORWeb, to register sensor applications and sampling devices and capture controlled metadata before and alongside any measurements in the field; data ingest, allowing researchers to feed data into storage systems and processing pipelines in a prepared and documented way, at best in controlled near real-time data streams; dashboards, allowing researchers to find and access data and to share and collaborate among partners; a workspace, enabling researchers to access and use data with research software, utilizing a cloud-based virtualized infrastructure that allows researchers to analyze massive amounts of data on the spot; and archiving and publishing of data via repositories and Digital Object Identifiers (DOIs).
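    To sketch what the documented ingest step in such a pipeline could look like, the snippet below parses one delimited near real-time record, checks that its columns match the parameters registered for the sensor, and appends it to a local store. The record layout, parameter list and validation rule are assumptions for illustration, not the actual O2A ingest specification.

        # Sketch of a documented ingest step: parse one delimited observation record,
        # check it against the registered output parameters, then persist it.
        # The record layout and parameter list are illustrative assumptions only.
        import csv, io

        REGISTERED_PARAMS = ["timestamp", "air_temperature", "wind_speed"]  # from the sensor registry

        def ingest(line: str, store: list) -> None:
            row = next(csv.reader(io.StringIO(line)))
            if len(row) != len(REGISTERED_PARAMS):
                raise ValueError(f"expected {len(REGISTERED_PARAMS)} columns, got {len(row)}")
            record = dict(zip(REGISTERED_PARAMS, row))
            store.append(record)              # stand-in for the storage/processing pipeline

        storage = []
        ingest("2019-10-05T00:00:00Z,-12.3,8.4", storage)
        print(storage[0]["air_temperature"])  # -> "-12.3"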

    Data management in MOSAiC – Challenges of the Multidisciplinary drifting Observatory for the Study of Arctic Climate

    During the MOSAiC expedition, the German research icebreaker Polarstern spends a full year drifting through the Arctic Ocean. Scientists from 20 countries participate in the largest polar expedition in history, exploring the Arctic climate system. The experiment covers a large suite of in-situ and remote sensing observations of physical, ecological and biogeochemical parameters to describe the processes coupling the atmosphere, sea ice, and ocean. In addition to forefront instrumentation and observational techniques, proper data management is essential for large and complex projects and field programs. Key elements are agreements on consistent sampling strategies, the possibility to monitor the data flow, to facilitate near real-time processing, and the analysis and sharing of data during and long after the expedition. Furthermore, data publication and documentation are crucial for such a collaborative effort; they will build the legacy of the project and finally take climate science to the next level. We adapted our modular research data management framework O2A “Data flow from Observations to Archives” to meet the expedition requirements and ensure central data archival for generations to come. Researchers register all necessary sensor metadata beforehand. Essential metadata of scientific actions in the field are ingested immediately with the FloeNavi, a novel system enabling navigation on a drifting ice floe. O2A provides tools to automate data ingestion, monitor the data flow, and process, analyze and publish data. Integration of ship- and land-based components and a shared storage ensures seamless continuation of collaboration during and after the expedition, laying the foundations for numerous data publications.