
    Citizen observatory based soil moisture monitoring – The GROW example

    GROW Observatory is a project funded under the European Union’s Horizon 2020 research and innovation program. Its aim is to establish a large-scale (more than 20,000 participants), resilient and integrated ‘Citizen Observatory’ (CO) and community for environmental monitoring that is self-sustaining beyond the life of the project. This article describes how the initial framework and tools were developed to evolve, bring together and train such a community; raising interest, engaging participants, and educating to support reliable observations, measurements and documentation, with a special focus on the reliability of the resulting dataset for scientific purposes. The scientific purposes of the GROW Observatory are to test the data quality and spatial representativity of a citizen-engagement-driven spatial distribution as reliable inputs for soil moisture monitoring; to create time series of gridded soil moisture products based on citizens’ observations using low-cost soil moisture (SM) sensors; and to provide an extensive dataset of in situ soil moisture observations which can serve as a reference to validate satellite-based SM products and support the Copernicus in situ component. This article showcases the initial steps of setting up such a monitoring network, reached at the mid-way point of the project’s funded period, focusing mainly on the design and development of the CO monitoring network.
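The gridding step described above — turning scattered citizen sensor readings into a regular soil moisture grid — can be illustrated with a simple inverse-distance-weighted (IDW) interpolation. This is a generic sketch under assumed coordinates and values, not the method actually used by GROW:

```python
import math

def idw(points, query, power=2.0):
    """Inverse-distance-weighted estimate at one grid node from
    (x, y, value) sensor readings (illustrative sketch only)."""
    num = den = 0.0
    for x, y, v in points:
        d = math.hypot(x - query[0], y - query[1])
        if d < 1e-9:
            return v  # node coincides with a sensor: use its reading
        w = 1.0 / d ** power
        num += w * v
        den += w
    return num / den

# Hypothetical volumetric soil-moisture readings (m3/m3) from three sensors
sensors = [(0.0, 0.0, 0.20), (1.0, 0.0, 0.30), (0.5, 1.0, 0.25)]

# Interpolate onto a 5x5 grid over the unit square
grid = [[idw(sensors, (i / 4, j / 4)) for i in range(5)] for j in range(5)]
```

Because IDW is a convex combination of the observations, every grid value stays within the range of the sensor readings — a useful sanity check for any gridded product built from point data.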

    Report from the Passive Microwave Data Set Management Workshop

    Passive microwave data sets are some of the most important data sets in the Earth Observing System Data and Information System (EOSDIS), providing data as far back as the early 1970s. The widespread use of passive microwave (PM) radiometer data has led to their collection and distribution over the years at several different Earth science data centers. The user community is often confused by this proliferation and the uneven spread of information about the data sets. In response to this situation, a Passive Microwave Data Set Management Workshop was held 17–19 May 2011 at the Global Hydrology Resource Center, sponsored by the NASA Earth Science Data and Information System (ESDIS) Project. The workshop attendees reviewed all primary (Level 1–3) PM data sets from NASA and non-NASA sensors held by NASA Distributed Active Archive Centers (DAACs), as well as high-value data sets from other NASA-funded organizations. This report provides the key findings and recommendations from the workshop, as well as detailed tabulations of the data sets considered.

    BQC: A free web service to quality control solar irradiance measurements across Europe

    Classical quality control (QC) methods of solar irradiance apply easy-to-implement physical or statistical limits that are incapable of detecting low-magnitude measuring errors due to the large width of the intervals. We previously presented the bias-based quality control (BQC), a novel method that flags samples in which the bias of several independent gridded datasets is larger than the historical value for consecutive days. The BQC was previously validated at 313 European and 732 Spanish stations, finding multiple low-magnitude errors (e.g., shadows, soiling) not detected by classical QC methods. However, the need for gridded datasets and ground measurements to characterize the bias was hindering the BQC implementation. To solve this issue, we present a free web service, www.bqcmethod.com, that implements the BQC algorithm, incorporating both the gridded datasets and the reference stations required to use the BQC across Europe from 1983 to 2018. Users only have to upload a CSV file with the global horizontal irradiance measurements to be analyzed. Compared to previous BQC versions, the gridded products have been upgraded to SARAH-2, CLARA-A2 and ERA5, and the spatial coverage has been extended to all of Europe. The web service provides a flexible environment that allows users to tune the BQC parameters and upload ancillary rain data that help in finding the causes of the errors. In addition, the outputs cover not only the visual and numerical QC flags but also daily and hourly estimations from the gridded datasets, facilitating access to raster data.

We thank the Instituto de Estudios Riojanos for funding part of the web service within the program Estudios Científicos de Temática Riojana, Spain. This research used resources from the Supercomputing Castilla y León Center (SCAYLE, www.scayle.es), funded by the European Regional Development Fund (ERDF). We would also like to thank the EU meteorological networks that freely distribute their datasets, and particularly those researchers who helped us in retrieving these data: Aku Riihelä and Anders Lindfors (FMI), Virginie Gorjoux (Météo France), Sandra Andersson (SMHI), and Jakub Walawender (IMGW-PIB). Finally, we thank the CM SAF and ECMWF for freely distributing their products, and particularly Jörg Trentmann for providing a beta version of CLARA-A2.1. RU is a postdoc from the University of La Rioja working as a visiting scientist at the European Commission’s Joint Research Centre (JRC). RU is funded by the Plan Propio de la Universidad de La Rioja, Spain, and the V Plan Riojano de I+D, Spain. The views expressed here are purely those of the authors and may not, under any circumstances, be regarded as an official position of the European Commission.
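The core idea behind a bias-based check — flagging runs of consecutive days whose bias against a reference departs from its historical value — can be sketched in a few lines. This toy version is illustrative only; the thresholds, parameters and logic are assumptions, not the published BQC algorithm:

```python
def bqc_flags(measured, reference, hist_bias, tol=2.0, run=3):
    """Toy bias-based QC: flag days where the daily bias
    (measured - reference) departs from its historical value by more
    than `tol` for at least `run` consecutive days. Illustrative only;
    not the published BQC algorithm."""
    dev = [abs((m - r) - hist_bias) > tol for m, r in zip(measured, reference)]
    flags = [False] * len(dev)
    i = 0
    while i < len(dev):
        if dev[i]:
            j = i
            while j < len(dev) and dev[j]:  # extend the anomalous run
                j += 1
            if j - i >= run:                # flag only persistent runs
                for k in range(i, j):
                    flags[k] = True
            i = j
        else:
            i += 1
    return flags

# A soiling-like step error in days 4-7 of hypothetical daily irradiance
measured  = [200.0, 201.0, 199.0, 150.0, 151.0, 149.0, 152.0, 200.0]
reference = [200.0] * 8
flags = bqc_flags(measured, reference, hist_bias=0.0, tol=20.0, run=3)
```

Requiring a run of consecutive anomalous days is what lets such a check catch persistent low-magnitude errors (shadows, soiling) while ignoring single-day noise that classical range limits would also miss.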

    Managing uncertainty in integrated environmental modelling: the UncertWeb framework

    Web-based distributed modelling architectures are gaining increasing recognition as potentially useful tools to build holistic environmental models, combining individual components in complex workflows. However, existing web-based modelling frameworks currently offer no support for managing uncertainty. On the other hand, the rich array of modelling frameworks and simulation tools which support uncertainty propagation in complex and chained models typically lack the benefits of web-based solutions such as ready publication, discoverability and easy access. In this article we describe the developments within the UncertWeb project which are designed to provide uncertainty support in the context of the proposed ‘Model Web’. We give an overview of uncertainty in modelling, review uncertainty management in existing modelling frameworks and consider the semantic and interoperability issues raised by integrated modelling. We describe the scope and architecture required to support uncertainty management as developed in UncertWeb. This includes tools which support elicitation, aggregation/disaggregation, visualisation and uncertainty/sensitivity analysis. We conclude by highlighting areas that require further research and development in UncertWeb, such as model calibration and inference within complex environmental models.
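Uncertainty propagation through chained model components is often done by Monte Carlo sampling: draw inputs from a distribution, run the chain on each draw, and summarise the output spread. The sketch below uses two invented toy components and illustrative parameter values; it is not UncertWeb code:

```python
import random
import statistics

random.seed(42)  # reproducible sampling

def runoff_model(rainfall_mm):
    # Toy component 1: runoff as a fixed fraction of rainfall
    return 0.35 * rainfall_mm

def stage_model(runoff_mm):
    # Toy component 2: river stage responds nonlinearly to runoff
    return 0.1 * runoff_mm ** 1.2

# Propagate an uncertain rainfall input (Gaussian, invented numbers)
# through the chained models by Monte Carlo sampling.
samples = [stage_model(runoff_model(random.gauss(50.0, 5.0)))
           for _ in range(10_000)]
mean_stage = statistics.mean(samples)
sd_stage = statistics.stdev(samples)
```

Because the second component is nonlinear, the output distribution is not simply a rescaled copy of the input one — which is precisely why chained-model frameworks need explicit uncertainty propagation rather than propagating only a single "best estimate" through each link.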

    The future of Earth observation in hydrology

    In just the past 5 years, the field of Earth observation has progressed beyond the offerings of conventional space-agency-based platforms to include a plethora of sensing opportunities afforded by CubeSats, unmanned aerial vehicles (UAVs), and smartphone technologies that are being embraced by both for-profit companies and individual researchers. Over the previous decades, space agency efforts have brought forth well-known and immensely useful satellites such as the Landsat series and the Gravity Recovery and Climate Experiment (GRACE) system, with costs typically of the order of 1 billion dollars per satellite and with concept-to-launch timelines of the order of 2 decades (for new missions). More recently, the proliferation of smartphones has helped to miniaturize sensors and energy requirements, facilitating advances in the use of CubeSats that can be launched by the dozens, while providing ultra-high (3-5 m) resolution sensing of the Earth on a daily basis. Start-up companies that did not exist a decade ago now operate more satellites in orbit than any space agency, and at costs that are a mere fraction of traditional satellite missions. With these advances come new space-borne measurements, such as real-time high-definition video for tracking air pollution, storm-cell development, flood propagation, precipitation monitoring, or even for constructing digital surfaces using structure-from-motion techniques. Closer to the surface, measurements from small unmanned drones and tethered balloons have mapped snow depths, floods, and estimated evaporation at sub-metre resolutions, pushing back on spatio-temporal constraints and delivering new process insights.
At ground level, precipitation has been measured using signal attenuation between antennae mounted on cell phone towers, while the proliferation of mobile devices has enabled citizen scientists to catalogue photos of environmental conditions, estimate daily average temperatures from battery state, and sense other hydrologically important variables such as channel depths using commercially available wireless devices. Global internet access is being pursued via high-altitude balloons, solar planes, and hundreds of planned satellite launches, providing a means to exploit the "internet of things" as an entirely new measurement domain. Such global access will enable real-time collection of data from billions of smartphones or from remote research platforms. This future will produce petabytes of data that can only be accessed via cloud storage and will require new analytical approaches to interpret. The extent to which today's hydrologic models can usefully ingest such massive data volumes is unclear. Nor is it clear whether this deluge of data will be usefully exploited, either because the measurements are superfluous, inconsistent, or not accurate enough, or simply because we lack the capacity to process and analyse them. What is apparent is that the tools and techniques afforded by this array of novel and game-changing sensing platforms present our community with a unique opportunity to develop new insights that advance fundamental aspects of the hydrological sciences. To accomplish this will require more than just an application of the technology: in some cases, it will demand a radical rethink of how we utilize and exploit these new observing systems.
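Rainfall retrieval from microwave-link attenuation, as mentioned above, commonly rests on the power-law relation k = a·R^b between specific attenuation k (dB/km) and rain rate R (mm/h). The sketch below inverts that relation; the coefficients a and b depend strongly on link frequency and polarisation, and the values used here are illustrative placeholders, not calibrated constants:

```python
def rain_rate_from_attenuation(path_loss_db, path_km, a=0.12, b=1.06):
    """Estimate rain rate (mm/h) from rain-induced attenuation on a
    microwave link using the power-law relation k = a * R**b, where k is
    the specific attenuation (dB/km). The a, b values here are
    illustrative placeholders; real values depend on frequency and
    polarisation."""
    k = path_loss_db / path_km      # specific attenuation in dB/km
    return (k / a) ** (1.0 / b)     # invert k = a * R**b for R

# Hypothetical link: 6 dB of rain-induced loss over a 10 km path
rate = rain_rate_from_attenuation(6.0, 10.0)
```

In practice the rain-induced loss must first be separated from baseline (dry-weather) attenuation, and the retrieval is an average over the whole path rather than a point measurement — both reasons such data complement, rather than replace, rain gauges.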

    Open data and interoperability standards: opportunities for animal welfare in extensive livestock systems

    Extensive livestock farming constitutes a sizeable portion of agriculture, not only in relation to land use, but in contribution to feeding a growing human population. In addition to meat, it contributes other economically valuable commodities such as wool, hides and other products. The livestock industries are adopting technologies under the banner of Precision Livestock Farming (PLF) to help meet higher production and efficiency targets, as well as to help manage the multiple challenges impacting the industries, such as climate change, environmental concerns, globalisation of markets, increasing rules of governance and societal scrutiny, especially in relation to animal welfare. PLF is particularly dependent on the acquisition and management of data and metadata and on the interoperability standards that allow data discovery and federation. A review of interoperability standards and PLF adoption in extensive livestock farming systems identified a lack of domain-specific standards and raised questions related to the amount and quality of public data which has the potential to inform livestock farming. A systematic review of public datasets was developed, which included an assessment based on the principles that data must be findable, accessible, interoperable and reusable (FAIR). A dataset search conducted with custom software scripts to determine the quantity and quality of domain-specific datasets yielded 419 unique Australian datasets directly related to extensive livestock farming. A FAIR assessment of these datasets using a set of non-domain-specific, general metrics showed a moderate level of compliance. The results suggest that domain-specific FAIR metrics may need to be developed to provide a more accurate data quality assessment, but also that the level of interoperability and reusability is not particularly high, which has implications if public data is to be included in decision support tools.

To test the usefulness of available public datasets in informing decision support in relation to livestock welfare, a case study was designed and farm animal welfare elements were extracted from Australian welfare standards to guide a dataset search. It was found that, with few exceptions, these elements could be supported with public data, although there were gaps in temporal and spatial coverage. The development of a geospatial animal welfare portal including these datasets further explored and confirmed the potential for using public data to enhance livestock welfare.
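An automated FAIR assessment of dataset metadata amounts to scoring each record against a checklist. The toy scorer below illustrates the idea with four crude, non-domain-specific checks; the field names, accepted formats and scoring are invented and far simpler than the metrics used in the thesis:

```python
def fair_score(meta):
    """Toy FAIR-style checklist: score one dataset metadata record
    against simple, non-domain-specific checks. Illustrative only;
    real FAIR metrics are far more detailed."""
    checks = {
        # Findable: has a persistent identifier and a title
        "findable": bool(meta.get("identifier") and meta.get("title")),
        # Accessible: a resolvable access URL is recorded
        "accessible": bool(meta.get("access_url")),
        # Interoperable: distributed in an open, machine-readable format
        "interoperable": meta.get("format", "").lower()
                         in {"csv", "json", "netcdf", "geojson"},
        # Reusable: a licence is stated
        "reusable": bool(meta.get("licence")),
    }
    return checks, sum(checks.values()) / len(checks)

# Hypothetical metadata record for an extensive-livestock dataset
record = {
    "identifier": "doi:10.0000/example",
    "title": "Paddock soil condition survey",
    "access_url": "https://example.org/data.csv",
    "format": "CSV",
    "licence": "CC-BY-4.0",
}
checks, score = fair_score(record)
```

Run over a harvested catalogue, a scorer like this yields the kind of aggregate compliance picture the review describes; the finding that generic checks over-simplify is visible even here, since "format is CSV" says nothing about whether the columns follow a livestock-domain vocabulary.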

    Couplers for linking environmental models: scoping study and potential next steps

    This report scopes out what couplers are available in the hydrology and atmospheric modelling fields. The work reported here examines both dynamic runtime and one-way file-based coupling. Based on a review of the peer-reviewed literature and other open sources, there is a plethora of coupling technologies and standards relating to file formats. The available approaches have been evaluated against criteria developed as part of the DREAM project. Based on these investigations, the following recommendations are made:
    • The most promising dynamic coupling technologies for use within BGS are OpenMI 2.0 and CSDMS (either 1.0 or 2.0).
    • Investigate the use of workflow engines: Trident and Pyxis, the latter as part of the TSB/AHRC project “Confluence”.
    • There is a need to include the catalogue standard CSW and the data-access library GDAL, and to use the climate community’s data formats, NetCDF and the CF conventions.
    • Develop a “standard” composition consisting of two process models and a 3D geological model, all linked to data stored in the BGS corporate database and in flat files; Web Feature Services should be included in these compositions.
    There is also a need to investigate other approaches in different disciplines: of these, the Loss Modelling Framework, OASIS-LMF, is the best candidate.
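The dynamic runtime coupling recommended above can be illustrated with a minimal pull-based component interface, loosely inspired by the "GetValues" pattern that OpenMI popularised. This is a sketch of the idea only, not the OpenMI 2.0 API, and the two toy models are invented:

```python
class Component:
    """Minimal pull-based model component, loosely inspired by the
    OpenMI 'GetValues' pattern (a sketch, not the OpenMI 2.0 API)."""
    def __init__(self, name, step):
        self.name = name
        self.step = step    # function: (time, inputs dict) -> output
        self.links = {}     # input name -> upstream component

    def get_values(self, t):
        # Pull required inputs from linked upstream components, then run
        inputs = {k: c.get_values(t) for k, c in self.links.items()}
        return self.step(t, inputs)

# Two toy components: a rainfall generator feeding a runoff model
rain = Component("rain", lambda t, _: 2.0 * t)              # mm at time t
runoff = Component("runoff", lambda t, ins: 0.4 * ins["rain"])
runoff.links["rain"] = rain                                  # runtime coupling

# The downstream component drives the run by pulling values on demand
flows = [runoff.get_values(t) for t in range(3)]
```

The key contrast with one-way file-based coupling is visible here: no intermediate files exist, and the downstream model requests exactly the values it needs at each time step, which is what makes feedbacks between components possible.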