
    Mapping the Suitability for Ice Core Drilling of Glaciers in the European Alps and the Asian High Mountains

    Ice cores from mid-latitude mountain glaciers provide detailed information on past climate conditions and regional environmental changes, which is essential for placing current climate change into a longer-term perspective. In this context, it is important to define guidelines and create dedicated maps to identify suitable areas for future ice-core drillings. In this study, the suitability for ice-core drilling (SICD) of a mountain glacier is defined as the possibility of extracting an ice core with preserved stratigraphy suitable for reconstructing past climate. Morphometric and climatic variables related to SICD are selected through a literature review and the characterization of previously drilled sites. A quantitative Weight of Evidence method is proposed to combine the selected variables (i.e. slope, local relief, temperature and direct solar radiation) to map potential drilling sites on mid-latitude mountain glaciers. The method was first developed in the European Alps and then applied to the Asian High Mountains. Model performance and limitations are discussed, and first indications of new potential drilling sites in the Asian High Mountains are provided. The results presented here can facilitate the selection of future drilling sites, especially on unexplored Asian mountain glaciers, toward an understanding of climate and environmental changes.
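    The Weight of Evidence combination described above can be sketched as follows. This is a minimal illustration of the general technique, not the paper's actual model: for each binary evidence layer (e.g. gentle slope, low temperature), a positive weight W+ = ln(P(E|D) / P(E|not D)) measures how much more often the evidence occurs at drilled sites (D) than elsewhere, and per-cell weights are summed onto the prior log-odds of suitability. All numeric values below are invented for illustration.

    ```python
    import math

    def positive_weight(p_evidence_given_site, p_evidence_given_nonsite):
        """W+ = ln(P(E|D) / P(E|not D)) for one binary evidence layer."""
        return math.log(p_evidence_given_site / p_evidence_given_nonsite)

    def suitability_log_odds(prior_probability, weights):
        """Posterior log-odds = prior log-odds + sum of evidence weights."""
        prior_odds = prior_probability / (1.0 - prior_probability)
        return math.log(prior_odds) + sum(weights)

    # Illustrative (invented) conditional probabilities for the four layers
    # named in the abstract: (P(E|drilled site), P(E|non-site)).
    layers = {
        "slope":       (0.8, 0.3),   # gentle slope is common at drilled sites
        "relief":      (0.7, 0.4),
        "temperature": (0.9, 0.5),
        "radiation":   (0.6, 0.5),
    }
    weights = [positive_weight(p, q) for p, q in layers.values()]
    log_odds = suitability_log_odds(0.01, weights)      # 1% prior suitability
    posterior = 1.0 / (1.0 + math.exp(-log_odds))       # back to probability
    print(f"posterior suitability: {posterior:.3f}")
    ```

    A cell where all four evidence layers are present ends up with a posterior suitability well above the prior, which is the basis for ranking candidate drilling sites on a map.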

    Boosting a Weather Monitoring System in Low Income Economies Using Open and Non-Conventional Systems: Data Quality Analysis

    In low-income and developing countries, inadequate weather monitoring systems adversely affect the capacity to manage natural resources and related risks. Low-cost IoT devices, combined with the wide diffusion of mobile connectivity and open technologies, offer a possible solution to this problem. This research quantitatively evaluates the data quality of a non-conventional, low-cost and fully open system. The proposed novel solution was tested for a duration of 8 months, and the collected observations were compared with a nearby authoritative weather station. The experimental weather station is based on Arduino and transmits data through the 2G General Packet Radio Service (GPRS) to istSOS, a software package for setting up a web service to collect, share and manage observations from sensor networks using the Sensor Observation Service (SOS) standard of the Open Geospatial Consortium (OGC). The results demonstrate that this accessible solution produces data of appropriate quality for natural resource and risk management.
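    The station-to-server flow described above can be sketched as follows. This is a hypothetical illustration: the payload schema, station name and endpoint are assumptions for the sketch, not the actual istSOS wire format or the project's configuration. On the real station the reading would be produced by the Arduino and POSTed over the 2G/GPRS link.

    ```python
    import json

    def build_observation(procedure, observed_property, value, timestamp):
        """Pack one sensor reading into a JSON payload (illustrative schema)."""
        return json.dumps({
            "procedure": procedure,            # the sensor/station identifier
            "observedProperty": observed_property,
            "result": value,
            "phenomenonTime": timestamp,       # ISO 8601, as used by SOS
        })

    payload = build_observation(
        "arduino_station_1", "air:temperature", 21.4, "2018-06-01T12:00:00Z")
    print(payload)

    # Transmission (not executed here) would be a plain HTTP POST, e.g.:
    #   req = urllib.request.Request(
    #       "https://example.org/istsos/...", data=payload.encode(),
    #       headers={"Content-Type": "application/json"})
    #   urllib.request.urlopen(req)
    ```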

    4onse D1.3 - Project Identity Manual

    This document describes the corporate identity developed for the 4onse project. The corporate identity consists of a logo for the overall project and templates for written and presentation materials, as well as printed communication materials.

    The Challenges of Reproducibility for Research Based on Geodata Web Services

    Modern research applies the Open Science approach, which fosters the production and sharing of Open Data according to the FAIR (Findable, Accessible, Interoperable, Reusable) principles. In the geospatial context this is generally achieved by setting up OGC Web services that implement open standards satisfying the FAIR requirements. Nevertheless, the Findability requirement is not fully satisfied by those services, since there is no use of persistent identifiers and no guarantee that the same dataset used for a study can be immutably accessed at a later time: a fact that hinders the replicability of research. This is particularly true in recent years, when data-driven research and technological advances have led to frequent updates of datasets. Here, we review needs and practices, supported by some real case examples, on frequent data or metadata updates in geo-datasets of different data types. Additionally, we assess the currently available tools that support data versioning for databases, files and log-structured tables. Finally, we discuss challenges and opportunities to enable geospatial web services that are fully FAIR: given the massive use and increasing availability of geospatial data, this would provide a great push toward Open Science compliance, with ultimate impacts on the transparency and credibility of science.
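    The immutable-access idea discussed above can be sketched with a content-based identifier: if a dataset snapshot is identified by a hash of its content rather than by a mutable URL, a later reader can verify that the version retrieved is exactly the version cited. This is a minimal illustration of the concept, not a tool assessed in the paper; the serialization scheme and record format are assumptions.

    ```python
    import hashlib

    def dataset_fingerprint(records):
        """SHA-256 over a canonical, order-independent serialization of
        the records, usable as an immutable snapshot identifier."""
        h = hashlib.sha256()
        for rec in sorted(records):          # sorting makes order irrelevant
            h.update(rec.encode("utf-8"))
            h.update(b"\n")
        return h.hexdigest()

    v1 = dataset_fingerprint(["station=LM1;t=12.1", "station=LM1;t=12.3"])
    v2 = dataset_fingerprint(["station=LM1;t=12.1", "station=LM1;t=12.3",
                              "station=LM1;t=12.4"])   # dataset updated
    print(v1 != v2)  # an update yields a new, distinct identifier
    ```

    A web service exposing such fingerprints alongside its datasets would let a citation pin one immutable version even as the live dataset keeps evolving.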

    Geospatial Webservices and Reproducibility of Research: Challenges and Needs

    This article investigates challenges and requirements related to the reproducibility of geospatial research based on geospatial web services. Several researchers have identified obstacles related to technology on the one hand, as well as challenges regarding existing well-known standards that respect the FAIR principles (findable, accessible, interoperable, reusable) on the other. Therefore, four hypotheses are established regarding reproducibility using geospatial web services, and these are addressed in an online survey. The results show correlations between academic affiliation, open standards, and reproducibility in geospatial research.

    Open and Cost-Effective Digital Ecosystem for Lake Water Quality Monitoring

    In some sectors of water resources management, the digital revolution is slowed by blocking factors such as costs, lack of digital expertise and resistance to change. In addition, in the era of Big Data, many sources of information are available in this field, but they are often not fully integrated. The adoption of different proprietary solutions to sense, collect and manage data is one of the main problems that hampers the availability of a fully integrated system. In this context, the aim of the project is to verify whether a fully open, cost-effective and replicable digital ecosystem for lake monitoring can fill this gap and support the digitalization process, using cloud-based technology and an Automatic High-Frequency Monitoring (AHFM) system built from open hardware and software components. Once developed, the system was tested and validated in a real-case scenario by integrating the historical databases and checking the performance of the AHFM system. The solution applies the edge computing paradigm to move some computational work from the server to the edge, fully exploiting the potential offered by low-power devices.

    Automated high frequency monitoring of Lake Maggiore through <em>in situ</em> sensors: system design, field test and data quality control

    A high frequency monitoring (HFM) system for the deep subalpine lakes Maggiore, Lugano and Como is under development within the EU INTERREG project SIMILE. The HFM system is designed to i) describe often neglected but potentially relevant processes occurring on short time scales; ii) become a cost-effective source of environmental data; and iii) strengthen the coordinated management of water resources in the subalpine lake district. Within this project framework, a first HFM station (LM1), consisting of a monitoring buoy, was placed in Lake Maggiore. LM1 represents a pilot experience within the project, aimed at providing the practical know-how needed for the development of the whole HFM system. To increase replicability and transferability, LM1 was developed in-house and conceived as a low-cost modular system. LM1 is presently equipped with solar panels, a weather station, and sensors for water temperature, pH, dissolved oxygen, conductivity, and chlorophyll-a. In this study, we describe the main features of LM1 (hardware and software) and the adopted Quality Assurance/Quality Control (QA/QC) procedures. To this end, we provide examples from a test period, i.e., the first nine months of operation of LM1. A description of the software selected for data management in the HFM system (istSOS) is also provided. Data gathered during the study period provided clear evidence that coupling HFM with discrete sampling for QA/QC controls is necessary to produce accurate data and to detect and correct errors, mainly caused by sensor fouling and calibration drift. These results also provide essential information for further developing the HFM system and shared protocols adapted to the local environmental (i.e., large subalpine lakes) and technical (available expertise) context. The next challenge is to make HFM not only a source of previously unaffordable information, but also a cost-effective tool for environmental monitoring.
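    Two of the QA/QC issues named above, implausible readings and calibration drift, can be sketched with generic checks. This is an illustrative sketch of common techniques for high-frequency sensor data, not the project's actual procedure; thresholds and values are invented.

    ```python
    def range_check(value, lo, hi):
        """Gross range check: flag values outside a plausible physical range."""
        return lo <= value <= hi

    def correct_drift(series, drift_at_end):
        """Linearly remove sensor drift, assuming it grew from 0 at the start
        of the series to drift_at_end, as measured against a discrete
        reference sample taken at the end of the period."""
        n = len(series)
        return [v - drift_at_end * i / (n - 1) for i, v in enumerate(series)]

    readings = [7.10, 7.12, 7.15, 7.20, 7.25]        # e.g. pH, hourly values
    assert all(range_check(v, 0.0, 14.0) for v in readings)

    corrected = correct_drift(readings, drift_at_end=0.10)
    print(corrected[-1])  # last value corrected by the full measured drift
    ```

    In practice the drift estimate comes from comparing the sensor against the discrete laboratory sample, which is exactly why the abstract argues that coupling HFM with discrete sampling is necessary.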