    Seafloor characterization using airborne hyperspectral co-registration procedures independent from attitude and positioning sensors

    The advance of remote-sensing technology and data-storage capabilities over the last decade has made commercial multi-sensor data collection routine. There is a constant need to characterize, quantify and monitor coastal areas for habitat research and coastal management. In this paper, we present work on seafloor characterization using hyperspectral imagery (HSI). The HSI data allow the operator to extend seafloor characterization from multibeam backscatter towards land, creating a seamless ocean-to-land characterization of the littoral zone.

    The PREVIEW Global Risk Data Platform: a geoportal to serve and share global data on risk to natural hazards

    With a growing world population concentrated in urban and coastal areas, exposure to natural hazards is increasing, resulting in a higher risk of human and economic losses. Improving the identification of areas, population and assets potentially exposed to natural hazards is essential to reduce the consequences of such events. Disaster risk is a function of hazard, exposure and vulnerability. Modelling risk at the global level requires accessing and processing a large number of datasets from numerous collaborating centres. These data need to be easily updated, and there is a need to centralize access to this information and to simplify its use for non-GIS specialists. The Hyogo Framework for Action provides the mandate for data sharing, so that governments and international development agencies can take appropriate decisions for disaster risk reduction. Timely access and easy integration of geospatial data are essential to support efforts in disaster risk reduction; however, various issues in data availability, accessibility and integration limit the use of such data. Consequently, a framework that facilitates the sharing and exchange of geospatial data on natural hazards should improve the decision-making process. The PREVIEW Global Risk Data Platform is a highly interactive web-based GIS portal, supported by a Spatial Data Infrastructure, that offers free and interoperable access to more than 60 global datasets on nine types of natural hazards (tropical cyclones and related storm surges, drought, earthquakes, biomass fires, floods, landslides, tsunamis and volcanic eruptions) and related exposure and risk. The application provides an easy-to-use online interactive mapping interface, and users can seamlessly integrate its data into their own workflows using fully compliant OGC Web Services (OWS).
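    The OGC Web Services mentioned above follow a uniform request convention, which is what makes the platform's layers consumable by any standards-compliant client. As a minimal sketch, the snippet below builds a WMS 1.3.0 GetMap request URL by hand; the endpoint and layer name are placeholders for illustration, not the PREVIEW platform's actual service addresses.

```python
from urllib.parse import urlencode

def build_wms_getmap_url(base_url, layer, bbox, width=512, height=512,
                         fmt="image/png", crs="EPSG:4326"):
    """Build an OGC WMS 1.3.0 GetMap request URL.

    bbox is (minx, miny, maxx, maxy); note that WMS 1.3.0 axis order
    depends on the CRS definition, so check the service capabilities.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": str(width),
        "HEIGHT": str(height),
        "FORMAT": fmt,
    }
    return f"{base_url}?{urlencode(params)}"

# Placeholder endpoint and hypothetical layer name, for illustration only.
url = build_wms_getmap_url(
    "https://example.org/geoserver/wms",
    layer="hazards:tropical_cyclones",
    bbox=(-90.0, -180.0, 90.0, 180.0),
)
```

    Because every OWS-compliant server accepts the same key-value parameters, the same helper would work against any such service; a WFS GetFeature request differs only in its parameter set.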

    Mapping of risk web-platforms and risk data: collection of good practices

    Successful disaster risk reduction (DRR) results from the combination of top-down strategies with bottom-up methodological approaches. The top-down approach refers to administrative directives, organizations, and operational skills linked with the management of risk, and reflects the policy component. The bottom-up approach is linked to the analysis of the causal factors of disasters, including exposure to hazards, vulnerability, and coping capacity, and reflects the practice component. In the context of disaster science, policy and practice are often disconnected. This is evident in the dominant top-down DRM strategies built on global actions on one hand, and the context-specific nature of the bottom-up approach, based on local action and knowledge, on the other. One way to bridge the gap between practice and policy is to develop a spatial data infrastructure in the form of GIS web-platforms based on risk mapping: a way of linking data, information and decision support systems (DSS) on a common ground that becomes a “battlefield of knowledge and actions”. This report presents the results of an overview of the risk web-platforms and related risk data used in risk assessment at the level of the EU-28. It surveys the current state of risk web infrastructures and their capabilities in order to establish a pool of good practices and to detect needs. The outcome of the overview identifies gaps in risk web-platform development and recommends capacities that should be prioritized in order to strengthen the link between risk data information and decision support systems (DSS). The assessment is based on web searches and the outcomes of diverse disaster risk workshops and conferences.

    Open-Access Geographic Sources And Data For The Study And Management Of Natural Resources

    The objective of this systematic review is to describe and analyze open geographic data provided by governmental sources, in order to give an overview of open geographic sources and data for the study and management of natural resources in Peru. For this purpose, the web was explored and scientific articles were reviewed, revealing a huge cartographic archive offered by the Peruvian State. On the one hand, public institutions have put their respective geoportals into operation; on the other, the PeruSAT-1 satellite was launched. Together these have increased the supply of official geospatial information over the last five years. In addition, geotechnical data were found in both raw and processed form from global initiatives. All of this documentary collection is available to the public openly and free of charge in cyberspace, and can be used in the study of the use, restoration, conservation and valuation of ecosystems and other elements of the environment.

    Coastal management and adaptation: an integrated data-driven approach

    Coastal regions are among the most exposed to environmental hazards, yet the coast is the preferred settlement site for a high percentage of the global population, and most major global cities are located on or near the coast. This research adopts a predominantly anthropocentric approach to the analysis of coastal risk and resilience, centred on the pervasive hazards of coastal flooding and erosion. Coastal management decision-making practices are shown to be reliant on access to current and accurate information. However, information flows between scientists, policy makers and practitioners are constrained by a lack of awareness and utilisation of available data sources. This research tackles that issue by evaluating how innovations in the use of data and analytics can further the application of science within decision-making processes related to coastal risk adaptation. In achieving this aim, a range of research methodologies has been employed, and the progression of topics covered marks a shift from themes of risk to resilience. The work focuses on a case study region of East Anglia, UK, benefiting from the input of a partner organisation responsible for the region’s coasts: Coastal Partnership East. An initial review revealed how data can be utilised effectively within coastal decision-making practices, highlighting scope for applying advanced Big Data techniques to the analysis of coastal datasets. The process of risk evaluation was examined in detail, and the range of possibilities afforded by open-source coastal datasets was revealed. Subsequently, open-source coastal terrain and bathymetric point-cloud datasets were identified for 14 sites within the case study area. These were then utilised within a practical application of a geomorphological change detection (GCD) method.
    This revealed how analysis of high spatial- and temporal-resolution point-cloud data can accurately reveal and quantify physical coastal impacts. Additionally, the research shows how data innovations can facilitate adaptation through insurance; more specifically, how the use of empirical evidence in the pricing of coastal flood insurance can result in both the communication and the distribution of risk. The various strands of knowledge generated throughout this study reveal how an extensive range of data types, sources and advanced forms of analysis can together allow coastal resilience assessments to be founded on empirical evidence. This research demonstrates how the application of advanced data-driven analytical processes can reduce the levels of uncertainty and subjectivity inherent in current coastal environmental management practices. Adoption of the methods presented within this research could further the possibilities for sustainable and resilient management of that incredibly valuable environmental resource, the coast.
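    At its core, the GCD method applied above compares elevation surfaces from two survey epochs and discards apparent change smaller than the survey uncertainty. The following is a minimal sketch of that DEM-of-difference computation, assuming two co-registered elevation grids (e.g., rasterised from the point-cloud surveys); the grids, cell size and detection threshold here are illustrative toy values, not data from the study.

```python
import numpy as np

def dem_of_difference(dem_new, dem_old, cell_size, min_detect=0.2):
    """Geomorphological change detection by DEM differencing.

    dem_new, dem_old : co-registered elevation grids (metres).
    cell_size : grid resolution (metres).
    min_detect : minimum level of detection (metres); differences below
        the combined survey uncertainty are treated as noise.
    Returns the difference grid plus net erosion and deposition volumes.
    """
    dod = dem_new - dem_old
    significant = np.abs(dod) >= min_detect          # mask sub-uncertainty change
    cell_area = cell_size ** 2
    erosion = dod[significant & (dod < 0)].sum() * cell_area     # m^3 lost
    deposition = dod[significant & (dod > 0)].sum() * cell_area  # m^3 gained
    return dod, erosion, deposition

# Toy 3x3 grids: one cell erodes 0.5 m, one accretes 0.3 m,
# the rest vary within the noise floor.
old = np.zeros((3, 3))
new = np.array([[-0.5, 0.0, 0.1],
                [0.0, 0.3, 0.0],
                [0.05, 0.0, -0.1]])
dod, ero, dep = dem_of_difference(new, old, cell_size=1.0)
```

    In practice the detection threshold is derived from the propagated uncertainty of each survey rather than fixed globally, which is one of the refinements the GCD literature adds to this basic differencing step.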

    Digital Railway System


    ICT for Disaster Risk Management:The Academy of ICT Essentials for Government Leaders


    Big Data Management for Cloud-Enabled Geological Information Services


    2017 DWH Long-Term Data Management Coordination Workshop Report

    On June 7 and 8, 2017, the Coastal Response Research Center (CRRC)[1], NOAA Office of Response and Restoration (ORR) and NOAA National Marine Fisheries Service (NMFS) Restoration Center (RC) co-sponsored the Deepwater Horizon Oil Spill (DWH) Long Term Data Management (LTDM) workshop at the ORR Gulf of Mexico (GOM) Disaster Response Center (DRC) in Mobile, AL. In the wake of the DWH Natural Resource Damage Assessment (NRDA) settlement, the focus has been on restoration planning, implementation and monitoring of ongoing DWH-related research. This means that data management, accessibility and distribution must be coordinated among various federal, state, local, non-governmental (NGO), academic and private-sector partners. The scope of DWH far exceeded any other spill in the U.S., with an immense amount of data (e.g., 100,000 environmental samples, 15 million publicly available records) gathered during the response and damage assessment phases of the incident, as well as data that continues to be produced from research and restoration efforts. The challenge with this influx of data is checking its quality, documenting its collection, storing it, integrating it into useful products, managing it and archiving it for long-term use. In addition, the data must be available to the public in an easily queried and accessible format. Answering questions regarding the success of the restoration efforts will depend on data generated for years to come. The data sets must be readily comparable, representative and complete; be collected using cross-cutting field protocols; be as interoperable as possible; meet standards for quality assurance/quality control (QA/QC); and be unhindered by conflicting or ambiguous terminology.
    During the data management process for the NOAA NRDA for the DWH disaster, NOAA developed a data management warehouse and visualization system that will be used as a long-term repository for accessing and archiving NRDA injury assessment data. This serves as a foundation for restoration project planning and monitoring data for the next 15 or more years. The main impetus for this workshop was to facilitate public access to the DWH data collected and managed by all entities, by developing linkages to, or data exchanges among, applicable GOM data management systems. Sixty-six workshop participants (Appendix A), representing a variety of organizations, met at NOAA’s GOM DRC to determine the characteristics of a successful common operating picture for DWH data, to understand the systems currently in place to manage DWH data, and to make DWH data interoperable between data generators, users and managers. The external partners for these efforts include, but are not limited to, the RESTORE Council, the Gulf of Mexico Research Initiative (GoMRI), the Gulf of Mexico Research Initiative Information and Data Cooperative (GRIIDC), the National Academy of Sciences (NAS) Gulf Research Program, the Gulf of Mexico Alliance (GOMA), and the National Fish and Wildlife Foundation (NFWF). The workshop objectives were to: foster collaboration among the GOM partners with respect to data management and integration for restoration planning, implementation and monitoring; identify standards, protocols and guidance for LTDM being used by these partners for DWH NRDA, restoration and public-health efforts; obtain feedback and identify next steps for the work completed by the Environmental Disasters Data Management (EDDM) Working Groups; and work towards best practices on the public distribution of, and access to, these data. The workshop consisted of plenary presentations and breakout sessions.
    The workshop agenda (Appendix B) was developed by the organizing committee. Presentation topics included: results of a pre-workshop survey, an overview of data generation, the uses of DWH long-term data, an overview of LTDM, an overview of existing LTDM systems, an overview of data management standards/protocols, results from the EDDM working groups, flow diagrams of existing data management systems, and a vision for managing big data. The breakout sessions included discussions of: issues/concerns for data stakeholders (e.g., data users, generators, managers), interoperability, ease of discovery/searchability, data access, data synthesis, data usability, and metadata/data documentation. [1] A list of acronyms is provided on Page 1 of this report.
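    A recurring theme above is that long-term data only stays usable if every record carries complete, unambiguous metadata. As a minimal sketch of the kind of automated completeness check such a repository might run at ingest, the snippet below validates records against a required-field list; the field names and example record are hypothetical illustrations, not an actual NOAA or DWH schema.

```python
# Illustrative required-metadata list; a real LTDM system would derive
# this from its adopted metadata standard, not hard-code it.
REQUIRED_FIELDS = {"title", "collection_date", "collecting_org",
                   "protocol", "units", "qa_status"}

def missing_metadata(record):
    """Return the required fields that are absent or empty in a record."""
    return sorted(f for f in REQUIRED_FIELDS
                  if not str(record.get(f, "")).strip())

# Hypothetical sample record with two gaps a curator must fill
# before the record is archived.
record = {
    "title": "Sediment contaminant concentrations, 2010 survey",
    "collection_date": "2010-07-14",
    "collecting_org": "NOAA",
    "protocol": "",                 # empty: fails the check
    "units": "ug/kg dry wt",
}                                   # qa_status missing entirely
gaps = missing_metadata(record)
```

    Running a check like this at submission time, rather than years later during synthesis, is one concrete way to address the interoperability and documentation concerns raised in the breakout sessions.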