
    EXPEDITION - An integrated approach to expose expedition information and research results

    The portal EXPEDITION offers an integrative "one-stop-shop" framework for the discovery and re-use of scientific content originating from research platforms operated by the Alfred Wegener Institute (AWI). This information-sharing framework is designed for interoperability and can be extended to various information systems worldwide. The framework is based on open technologies, and access is freely available to scientists, funding agencies and the public. Since AWI's research focuses on both Polar Regions, the portal offers a variety of ready-to-use data products from the Arctic Ocean, the Southern Ocean and Antarctica, as well as from AWI-operated observing networks.

    Coastal research needs common data infrastructures

    Marine and coastal research has developed from local, sporadic and limited measurements to long-term surveys and monitoring campaigns, and consequently to the data-intensive and integrative science of today. Linking and accessing data beyond disciplinary boundaries has become essential for marine research and coastal management. Therefore, sufficient national and international data infrastructures are fundamental for central and easy access to the variety of existing, but distributed, datasets in marine and coastal research. The Marine Network for Integrated Data Access (MaNIDA) provides a nationally networked approach to accessing and mining federated marine research data infrastructures, together with data management strategies and data workflows. To this end, the consortium conceptualized and developed a data portal for the coherent discovery, viewing, download and dissemination of scientific data and publications. The Data Portal German Marine Research is based on a central harvesting and interfacing approach that connects distributed data sources. Since the German network of content providers has distinct objectives and mandates for storing data and information (e.g. long-term data preservation, near-real-time data, publication repositories), we have to cope with metadata that are heterogeneous in syntax and semantics, data types and formats, as well as access solutions. We therefore defined a set of core metadata elements that are common to our content providers and useful for discovery and for building relationships. Existing catalogues of various types of vocabularies are used to ensure the mapping to community-wide terms. The web application allows browsing by, for example, monitoring platform, vessel and date, for exploring data and research gaps. Data-related information is homogeneously presented to the user and adaptable to specific disciplines. Data access and dissemination information is available as direct access, web services or a data download link.
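    The harvesting step described above maps each provider's heterogeneous metadata onto the shared core elements. The following sketch illustrates that idea; the core element list, provider names and field paths are illustrative assumptions, not the actual MaNIDA schema.

```python
# Illustrative sketch: mapping heterogeneous provider records onto a
# small set of core metadata elements. All field names and provider
# mappings below are assumptions for demonstration only.

PROVIDER_MAPPINGS = {
    # core element -> dotted path into the provider's raw record
    "pangaea": {"title": "Title", "platform": "Event.Platform",
                "date": "Event.Date", "access_url": "URI"},
    "repo_b":  {"title": "name", "platform": "vessel",
                "date": "start_date", "access_url": "link"},
}

def get_path(record, dotted):
    """Resolve a dotted path like 'Event.Platform' in a nested dict."""
    for key in dotted.split("."):
        record = record.get(key, {}) if isinstance(record, dict) else {}
    return record or None

def to_core(record, provider):
    """Translate a raw provider record into the core element set."""
    mapping = PROVIDER_MAPPINGS[provider]
    return {elem: get_path(record, path) for elem, path in mapping.items()}

raw = {"Title": "CTD profile PS99", "URI": "https://example.org/dataset/1",
       "Event": {"Platform": "Polarstern", "Date": "2016-06-24"}}
core = to_core(raw, "pangaea")
```

    Once every harvested record is reduced to the same core elements, discovery, faceted browsing and relationship building can operate on one homogeneous structure regardless of the source repository.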

    Newsletter of the Digital Earth Project: Contributions of the Alfred Wegener Institute to Digital Earth

    As an important technical pillar of Digital Earth, the AWI computing centre provides data management and cloud-processing services to the project partners. We develop project-specific extensions to the AWI data-flow framework O2A (Observation to Archive). Sensor registration in O2A will support flexible handling of sensors and their metadata; for the Digital Earth showcases, methane and soil-moisture measurements are the focus for smart monitoring designs and for access to data in near real time (NRT). Furthermore, data exploration is supported by a raster-data manager service that can easily be coupled in users' data workflows with other data sources, such as NRT sensor data. In the following we give more details on O2A, its components and its concept.
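    Coupling the raster-data service with NRT sensor data can be as simple as looking up the gridded value at a sensor's position and comparing it with the in-situ reading. The sketch below shows that pattern with a toy in-memory grid; the grid geometry, values and sensor record are made up for illustration and do not reflect the actual service interface.

```python
# Minimal sketch: couple gridded (raster) data with a point sensor
# reading via a nearest-cell lookup. Grid and sensor values are
# fabricated for illustration only.

soil_moisture = [            # toy 3x3 raster, row 0 = northernmost row
    [0.12, 0.15, 0.18],
    [0.20, 0.22, 0.25],
    [0.30, 0.28, 0.26],
]
LAT0, LON0, CELL = 54.0, 8.0, 0.5   # grid origin and cell size (degrees)

def raster_value(lat, lon):
    """Return the raster value of the cell nearest to a position."""
    row = round((LAT0 - lat) / CELL)   # latitude decreases with row index
    col = round((lon - LON0) / CELL)
    return soil_moisture[row][col]

# Hypothetical NRT soil-moisture sensor reading at a known position.
sensor = {"id": "soil-07", "lat": 53.5, "lon": 8.5, "value": 0.21}
grid_val = raster_value(sensor["lat"], sensor["lon"])
deviation = sensor["value"] - grid_val   # in-situ minus gridded value
```

    In a real workflow the toy grid would be replaced by a query against the raster-data manager service, but the coupling logic, matching point observations to grid cells, stays the same.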

    Automatic data quality control for understanding extreme climate events

    The understanding of extreme events strongly depends on knowledge gained from data. Data integration across multiple sources, scales and Earth compartments is the focus of the project Digital Earth, which also joins efforts on the quality control of data. Automatic quality control is embedded in the ingest component of O2A, the observation-to-archive data-flow framework of the Alfred Wegener Institute. In that framework, the O2A-Sensor provides observation properties to the O2A-Ingest, which delivers quality-flagged data to the O2A-Dashboard. The automatic quality control currently follows a procedural approach, in which modules implement formulations found in the literature and in other operational observatory networks. A set of plausibility tests, including range, spike and gradient tests, is currently operational. The automatic quality control scans the ingested data in near real time (NRT), builds a table of devices, and searches, either by absolute or derivative values, for the correctness and validity of observations. The availability of observation properties, for instance test parameters such as physical or operational ranges, triggers the automatic quality control, which in turn iterates through the table of devices to set the quality flag for each sample and observation. To date, the quality flags in use are sequential and qualitative, i.e. they describe a level of quality in the data. A new flagging system is under development that will add a descriptive characteristic comprising technical and user interpretation. Within Digital Earth, data on flood and drought events along the Elbe River and on methane emissions in the North Sea are to be reviewed using automatic quality control. Fast and scalable automatic quality control will disentangle the uncertainty raised by quality issues and thus improve our understanding of extreme events in those cases.
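    The plausibility tests named above (range, spike, gradient) can be sketched as simple per-sample checks that each emit a flag, with the worst flag kept per sample. The flag values and thresholds below are illustrative, not the operational O2A flagging scheme.

```python
# Sketch of procedural plausibility tests for NRT quality control.
# Flag convention here is an assumption: 0 = good, 1 = range failure,
# 2 = spike, 3 = gradient failure.

def range_test(values, vmin, vmax):
    """Flag samples outside the physically plausible range."""
    return [0 if vmin <= v <= vmax else 1 for v in values]

def spike_test(values, threshold):
    """Flag samples that deviate sharply from the mean of both neighbours."""
    flags = [0] * len(values)
    for i in range(1, len(values) - 1):
        if abs(values[i] - (values[i - 1] + values[i + 1]) / 2) > threshold:
            flags[i] = 2
    return flags

def gradient_test(values, max_step):
    """Flag samples whose step from the previous sample is too large."""
    flags = [0] * len(values)
    for i in range(1, len(values)):
        if abs(values[i] - values[i - 1]) > max_step:
            flags[i] = 3
    return flags

def combine(*flag_lists):
    """Keep the worst (highest) flag per sample."""
    return [max(fs) for fs in zip(*flag_lists)]

temps = [4.1, 4.2, 25.0, 4.3, 4.2]   # toy water temperatures with a spike
flags = combine(
    range_test(temps, -2.0, 30.0),
    spike_test(temps, threshold=5.0),
    gradient_test(temps, max_step=5.0),
)
```

    Note that spike and gradient tests also flag the neighbours of an outlier, since the large step is detected on both sides; operational schemes typically refine this with additional context from the observation properties.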

    An integrative solution for managing, tracing and citing sensor-related information

    In a data-driven scientific world, the need to capture information on the sensors used in the data acquisition process has become increasingly important. Following the recommendations of the Open Geospatial Consortium (OGC), we started by adopting the SensorML standard for describing platforms, devices and sensors. However, it soon became obvious to us that understanding, implementing and filling such standards costs significant effort and cannot be expected from every scientist individually. We therefore developed a web-based sensor management solution (https://sensor.awi.de) that describes platforms, devices and sensors as a hierarchy of systems and supports tracing changes to a system while hiding complexity. Each platform contains devices, and each device can have sensors associated with specific identifiers, contacts, events, related online resources (e.g. manufacturer factsheets, calibration documentation, data-processing documentation), sensor output parameters and geo-location. In order to better understand and address real-world requirements, we interacted closely with field-going scientists in the context of the key national infrastructure project "FRontiers in Arctic marine Monitoring ocean observatory" (FRAM) during the software development. We learned that not only the lineage of observations is crucial for scientists, but also, for example, alert services using value ranges, flexible output formats and information on data providers (e.g. FTP sources). Most importantly, persistent and citable versions of sensor descriptions are required for traceability and reproducibility, allowing seamless integration with existing information systems, e.g. PANGAEA. Within the context of the EU-funded Ocean Data Interoperability Platform project (ODIP II) and in cooperation with 52north, we are providing near-real-time data via Sensor Observation Services (SOS) along with sensor descriptions based on our sensor management solution. ODIP II also aims to develop a harmonized SensorML profile for the marine community, which we will adopt in our solution as soon as it is available. In this presentation we will show our sensor management solution, which is embedded in our data-flow framework to offer out-of-the-box interoperability with existing information systems and standards. In addition, we will present real-world examples and challenges related to the description and traceability of sensor metadata.
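    The platform > device > sensor hierarchy with traced events can be sketched as a small set of nested record types. The class and field names below are assumptions for illustration and do not reflect the actual sensor.awi.de data model.

```python
# Illustrative sketch of a platform/device/sensor hierarchy with event
# tracking. Names and fields are assumptions, not the real schema.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Event:
    """A traced change to a system, e.g. a calibration or deployment."""
    when: date
    description: str

@dataclass
class Sensor:
    identifier: str                 # persistent, citable identifier
    output_parameter: str           # measured quantity
    events: list = field(default_factory=list)

@dataclass
class Device:
    name: str
    sensors: list = field(default_factory=list)

@dataclass
class Platform:
    name: str
    devices: list = field(default_factory=list)

# Hypothetical example: a CTD on a mooring, with one traced event.
ctd = Device("CTD SBE911plus", sensors=[
    Sensor("vocab:temp-001", "sea water temperature"),
])
ctd.sensors[0].events.append(Event(date(2017, 3, 1), "factory calibration"))
mooring = Platform("FRAM mooring F4", devices=[ctd])
```

    Serializing such a hierarchy to SensorML, and freezing a citable version of it on each change, is what gives downstream archives like PANGAEA a stable reference for observation lineage.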

    O2A - Data Flow Framework from Sensor Observations to Archives

    The Alfred Wegener Institute (AWI) coordinates German polar research and is one of the most productive polar research institutions worldwide, with scientists working in both Polar Regions; a task that can only succeed with the help of excellent infrastructure and logistics. Conducting research in the Arctic and Antarctic requires research stations staffed throughout the year as the basis for expeditions and data collection, as well as research vessels, aircraft, long-term observatories for large-scale measurements and sophisticated technology. AWI also provides this infrastructure and competence to national and international partners. To meet this challenge, AWI has been progressively developing and sustaining an e-infrastructure for the coherent discovery, visualization, dissemination and archival of scientific information and data. Most of the data originates from research activities carried out on a wide range of sea-, air- and land-based research platforms. Archival and publication in the PANGAEA repository, along with DOI assignment to individual datasets, is the pursued end-of-line step. Within AWI, a workflow for data acquisition from vessel-mounted devices, along with ingestion procedures for the raw data into the institutional archives, is well established. However, the increasing number of ocean-based stations and respective sensors, together with heterogeneous project-driven requirements towards satellite communication, sensor monitoring, quality control and validation, processing algorithms, visualization and dissemination, has recently led us to build a more generic and cost-effective framework, hereafter named O2A (Observations to Archives). The main strengths of our framework (https://www.awi.de/en/data-flow) are the seamless flow from sensor observation to archive and its compliance with internationally used OGC standards, assuring interoperability in an international context (e.g. SOS/SWE, WMS, WFS). O2A comprises several extensible and exchangeable modules (e.g. controlled vocabularies and gazetteers, file-type and structure validation, aggregation solutions, processing algorithms) as well as various interoperability services. We provide integrated tools for standardized platform, device and sensor descriptions following SensorML (https://sensor.awi.de), automated near-real-time and "big data" streams supporting SOS and O&M, and dashboards allowing data specialists to monitor their data streams for trends and for early detection of sensor malfunction (https://dashboard.awi.de). Also, in the context of the Helmholtz Data Federation and with an outlook towards the European Open Science Cloud, we are developing a cloud-based workspace providing user-friendly solutions for data storage on the petabyte scale and state-of-the-art computing solutions (Hadoop, Spark, Notebooks, rasdaman, etc.) to support scientists in collaborative data analysis and visualization activities, including geo-information systems (http://maps.awi.de). Our affiliated repositories offer archival, long-term preservation and publication solutions for data, data products, publications, presentations and field reports (https://www.pangaea.de, https://epic.awi.de).
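    The extensible, exchangeable modules of such an ingest chain can be pictured as a pipeline of functions, each validating or enriching a record before it reaches the archive. The stage names, field names and vocabulary below are illustrative assumptions, not the actual O2A implementation.

```python
# Conceptual sketch of a modular observation-to-archive ingest chain:
# each stage is an exchangeable function from record to record.
# Stage logic and the tiny vocabulary are illustrative assumptions.

def validate_structure(record):
    """Check that the expected fields are present."""
    for key in ("sensor_id", "parameter", "value"):
        if key not in record:
            raise ValueError(f"missing field: {key}")
    return record

def validate_vocabulary(record):
    """Reject parameters not found in the controlled vocabulary."""
    allowed = {"temperature", "salinity", "pressure"}   # toy vocabulary
    if record["parameter"] not in allowed:
        raise ValueError(f"unknown parameter: {record['parameter']}")
    return record

def archive(record):
    """Stand-in for handing the record over to long-term archival."""
    record["archived"] = True
    return record

# Stages are exchangeable: reorder, replace or extend this list.
PIPELINE = [validate_structure, validate_vocabulary, archive]

def ingest(record):
    for stage in PIPELINE:
        record = stage(record)
    return record

result = ingest({"sensor_id": "ctd-01", "parameter": "salinity",
                 "value": 34.9})
```

    Keeping each stage as an independent function is what makes the modules exchangeable: a new quality-control test or a different vocabulary service slots into the list without touching the rest of the chain.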