6,463 research outputs found

    A review of the internet of floods: near real-time detection of a flood event and its impact

    Worldwide, flood events frequently have a dramatic impact on urban societies. Time is key during a flood event: to evacuate vulnerable people at risk, to minimize the socio-economic, ecological and cultural impact of the event, and to help a society recover from the hazard as quickly as possible. Detecting a flood in near real-time and assessing the associated risks on the fly is therefore of great importance, and it creates a need to find the optimal way to collect data for real-time flood detection. The Internet of Things (IoT) is an ideal approach for bringing together data from sensing equipment and identifying tools with networking and processing capabilities, allowing these devices to communicate with one another and with other devices and services over the Internet to detect floods in near real-time. The main objective of this paper is to report on the current state of research on the IoT in the domain of flood detection. Current trends in IoT are identified, and the academic literature is examined. The integration of IoT would greatly enhance disaster management and will therefore be of growing importance in the future.
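
    The pattern this abstract describes amounts to networked water-level sensors pushing readings to a central service that raises alerts. As a purely illustrative Python sketch (the paper prescribes no specific stack; the endpoint URL, station name and 2.5 m alert threshold below are hypothetical), one node of such a system might look like:

        # Hypothetical IoT water-level node: posts readings to a central
        # collection service and flags readings above a flood threshold.
        import time
        import random

        import requests

        ENDPOINT = "https://example.org/api/flood/readings"  # hypothetical service
        ALERT_LEVEL_M = 2.5  # hypothetical flood stage for this station

        def read_level_m():
            # Stand-in for a real ultrasonic or pressure-gauge driver.
            return 2.0 + random.random()

        while True:
            level = read_level_m()
            payload = {"station": "bridge-07", "level_m": level, "ts": time.time()}
            requests.post(ENDPOINT, json=payload, timeout=10)
            if level > ALERT_LEVEL_M:
                # A real deployment would notify an early-warning service here.
                requests.post(ENDPOINT + "/alerts", json=payload, timeout=10)
            time.sleep(60)

    In practice such nodes would more likely publish over a lightweight protocol such as MQTT, and the service side would aggregate several stations before declaring a flood; the sketch only shows the sense-report-alert loop.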

    Science for Disaster Risk Reduction

    This thematic report describes the JRC's activities in support of disaster management. The JRC develops tools and methodologies to help in all phases of disaster management, from preparedness and risk assessment, through forecasting and early warning, to recovery and reconstruction.

    Coastal management and adaptation: an integrated data-driven approach

    Coastal regions are some of the most exposed to environmental hazards, yet the coast is the preferred settlement site for a high percentage of the global population, and most major global cities are located on or near the coast. This research adopts a predominantly anthropocentric approach to the analysis of coastal risk and resilience, centred on the pervasive hazards of coastal flooding and erosion. Coastal management decision-making practices are shown to be reliant on access to current and accurate information. However, constraints have been imposed on information flows between scientists, policy makers and practitioners due to a lack of awareness and utilisation of available data sources. This research seeks to tackle this issue by evaluating how innovations in the use of data and analytics can be applied to further the application of science within decision-making processes related to coastal risk adaptation. In achieving this aim, a range of research methodologies has been employed, and the progression of topics covered marks a shift from themes of risk to resilience. The work focuses on a case study region of East Anglia, UK, benefiting from the input of a partner organisation responsible for the region’s coasts: Coastal Partnership East. An initial review revealed how data can be utilised effectively within coastal decision-making practices, highlighting scope for the application of advanced Big Data techniques to the analysis of coastal datasets. The process of risk evaluation has been examined in detail, and the range of possibilities afforded by open-source coastal datasets was revealed. Subsequently, open-source coastal terrain and bathymetric point-cloud datasets were identified for 14 sites within the case study area. These were then utilised within a practical application of a geomorphological change detection (GCD) method. This revealed how the analysis of high spatial and temporal resolution point-cloud data can accurately reveal and quantify physical coastal impacts. Additionally, the research reveals how data innovations can facilitate adaptation through insurance; more specifically, how the use of empirical evidence in the pricing of coastal flood insurance can result in both the communication and the distribution of risk. The various strands of knowledge generated throughout this study reveal how an extensive range of data types, sources and advanced forms of analysis can together allow coastal resilience assessments to be founded on empirical evidence. This research serves to demonstrate how the application of advanced data-driven analytical processes can reduce the levels of uncertainty and subjectivity inherent in current coastal environmental management practices. Adoption of the methods presented within this research could further the possibilities for sustainable and resilient management of the incredibly valuable environmental resource that is the coast.
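
    The geomorphological change detection (GCD) step mentioned above is, at its core, a DEM-of-Difference calculation: subtract two gridded surfaces, discard changes below the survey's level of detection, and convert what remains into volumes. A minimal sketch with rasterio and NumPy follows; the file names and the 0.2 m level of detection are assumptions for illustration, not values from the thesis.

        # DEM of Difference (DoD): the core GCD calculation. File names and
        # the 0.2 m level of detection are hypothetical.
        import numpy as np
        import rasterio

        LOD_M = 0.2  # minimum detectable change, set from survey uncertainty

        with rasterio.open("dem_2015.tif") as t0, rasterio.open("dem_2020.tif") as t1:
            dod = t1.read(1).astype(float) - t0.read(1).astype(float)
            cell_area_m2 = abs(t0.transform.a * t0.transform.e)  # pixel size in m^2

        dod[np.abs(dod) < LOD_M] = np.nan  # discard sub-threshold noise
        erosion_m3 = np.nansum(np.where(dod < 0, dod, 0)) * cell_area_m2
        deposition_m3 = np.nansum(np.where(dod > 0, dod, 0)) * cell_area_m2
        print(f"erosion: {erosion_m3:.1f} m^3, deposition: {deposition_m3:.1f} m^3")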

    Google Earth Engine as a multi-sensor open-source tool for supporting the preservation of archaeological areas: the case study of flood and fire mapping in Metaponto, Italy

    In recent years, the impact of climate change and of anthropogenic and natural hazards (such as earthquakes, landslides, floods, tsunamis and fires) has dramatically increased, adversely affecting modern and historic human buildings, including outstanding cultural properties and UNESCO heritage sites. Research on the protection and monitoring of cultural heritage is crucial to preserving our cultural properties and, with them, our history and identity. This paper focuses on the use of the open-source Google Earth Engine tool to analyze flood and fire events that affected the area of Metaponto (southern Italy), near the homonymous Greek-Roman archaeological site. Google Earth Engine enabled the supervised and unsupervised classification of areas affected by flooding (2013–2020) and fire (2017), yielding remarkable results and useful information for devising strategies to mitigate damage and support the preservation of areas and landscapes rich in cultural and natural heritage.
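
    To give a flavour of this kind of Earth Engine workflow, the sketch below derives a simple water mask from a Sentinel-2 composite over the Metaponto area using the Earth Engine Python API. The dates, the approximate coordinates and the 0.0 NDWI threshold are assumptions for illustration, not the authors' exact setup (which also used supervised and unsupervised classifiers):

        # Illustrative Earth Engine sketch: NDWI water mask from a
        # Sentinel-2 composite. Dates, coordinates and threshold are assumed.
        import ee

        ee.Initialize()

        aoi = ee.Geometry.Point([16.81, 40.38]).buffer(10000)  # approx. Metaponto

        composite = (
            ee.ImageCollection("COPERNICUS/S2_SR_HARMONIZED")
            .filterBounds(aoi)
            .filterDate("2020-11-01", "2020-12-31")
            .filter(ee.Filter.lt("CLOUDY_PIXEL_PERCENTAGE", 20))
            .median()
        )

        # NDWI = (Green - NIR) / (Green + NIR); values above 0 suggest open water.
        ndwi = composite.normalizedDifference(["B3", "B8"])
        water = ndwi.gt(0.0).selfMask().clip(aoi)
        print(water.getInfo()["bands"])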

    Potential and Limitations of Open Satellite Data for Flood Mapping

    Satellite remote sensing is a powerful tool for mapping flooded areas. In recent years, the availability of free satellite data has increased significantly in terms of type and frequency, allowing the production of flood maps at low cost around the world. In this work, we propose a semi-automatic method for flood mapping based only on free satellite images and open-source software. The proposed method is designed to be usable by the flood hazard management community, whose members are not necessarily experts in remote sensing processing. As case studies, we selected three flood events that recently occurred in Spain and Italy. Multispectral satellite data acquired by MODIS, Proba-V, Landsat and Sentinel-2, and synthetic aperture radar (SAR) data collected by Sentinel-1, were used to detect flooded areas using different methodologies (e.g., the Modified Normalized Difference Water Index, SAR backscattering variation, and supervised classification). We then improved and manually refined the automatic mapping using free ancillary data, such as a digital elevation model-based water depth model and available ground-truth data. We calculated flood detection performance (the flood ratio) for the different datasets by comparison with flood maps made by official river authorities. The results show that several factors must be considered when selecting the best satellite data; among them, the time of the satellite pass with respect to the flood peak is the most important. With co-flood multispectral images, more than 90% of the flooded area was detected in the 2015 Ebro flood (Spain) case study. With post-flood multispectral data, the flood ratio fell below 50% a few weeks after the 2016 flood in the Po and Tanaro plains (Italy), but it remained useful for mapping the inundated pattern. SAR could detect flooding only at the co-flood stage, and the flood ratio dropped below 5% just a few days after the 2016 Po River inundation. A further result of the research was the creation of geomorphology-based inundation maps that matched official flood maps by up to 95%.
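
    Of the methodologies listed, the Modified Normalized Difference Water Index is the most self-contained: MNDWI = (Green − SWIR) / (Green + SWIR), with values above roughly zero commonly treated as water. A minimal sketch with rasterio and NumPy; the band file names and the 0.0 threshold are assumptions, and both bands are assumed already resampled onto a common grid (Sentinel-2 B11 is natively 20 m versus 10 m for B3):

        # Sketch of the MNDWI water-mapping step named in the abstract.
        # File names and threshold are hypothetical.
        import numpy as np
        import rasterio

        with rasterio.open("B03_green.tif") as g, rasterio.open("B11_swir.tif") as s:
            green = g.read(1).astype(float)
            swir = s.read(1).astype(float)

        # MNDWI = (Green - SWIR) / (Green + SWIR); higher values favour water.
        with np.errstate(divide="ignore", invalid="ignore"):
            mndwi = (green - swir) / (green + swir)

        flood_mask = mndwi > 0.0  # threshold is scene-dependent; 0.0 as a default
        print(f"flooded fraction of scene: {flood_mask.mean():.2%}")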

    Seafloor characterization using airborne hyperspectral co-registration procedures independent from attitude and positioning sensors

    Advances in remote-sensing technology and data-storage capabilities have progressed over the last decade to commercial multi-sensor data collection. There is a constant need to characterize, quantify and monitor coastal areas for habitat research and coastal management. In this paper, we present work on seafloor characterization that uses hyperspectral imagery (HSI). The HSI data allow the operator to extend seafloor characterization from multibeam backscatter towards land, and thus create a seamless ocean-to-land characterization of the littoral zone.
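
    The abstract does not state which algorithm turns HSI pixels into seafloor classes; one standard choice for hyperspectral data (named here as an illustration, not as the authors' method) is the spectral angle mapper (SAM), which assigns each pixel to the reference spectrum with the smallest spectral angle. A toy NumPy sketch, with cube dimensions, class names and reference spectra all invented:

        # Toy spectral angle mapper (SAM) over a synthetic HSI cube.
        # All data here are invented for illustration.
        import numpy as np

        def spectral_angle(pixels, reference):
            # pixels: (n, bands); reference: (bands,). Returns angle in radians.
            cos = pixels @ reference / (
                np.linalg.norm(pixels, axis=1) * np.linalg.norm(reference))
            return np.arccos(np.clip(cos, -1.0, 1.0))

        rng = np.random.default_rng(0)
        cube = rng.random((100, 100, 120))  # hypothetical 120-band cube
        pixels = cube.reshape(-1, 120)
        refs = {"sand": rng.random(120), "seagrass": rng.random(120)}

        # Classify each pixel by its nearest reference spectrum.
        angles = np.stack([spectral_angle(pixels, r) for r in refs.values()])
        labels = np.array(list(refs))[angles.argmin(axis=0)].reshape(100, 100)
        print(np.unique(labels, return_counts=True))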

    Identifying success factors in crowdsourced geographic information use in government

    Crowdsourcing geographic information in government focuses on projects that engage people who are not government officials or employees in collecting, editing and sharing information with governmental bodies. Such projects emerged in the past decade due to technological and societal changes, such as the increased use of smartphones combined with citizens' growing levels of education and technical ability to use them. They have also flourished because of the need for updated data at relatively short notice when financial resources are low. They range from recording the experience of feeling an earthquake to recording the locations of businesses during the summer. Fifty cases of projects in which crowdsourced geographic information was used by governmental bodies across the world are analysed. About 60% of the cases were examined both in 2014 and in 2017, allowing comparison and the identification of success and failure. The analysis looked at different aspects and their relationship to success: the drivers to start a project; scope and aims; stakeholders and relationships; inputs into the project; technical and organisational aspects; and problems encountered. The key factors of the case studies were analysed using Qualitative Comparative Analysis (QCA), an analytical method that combines quantitative and qualitative tools in sociological research. From the analysis, we can conclude that there is no “magic bullet” or perfect methodology for a successful crowdsourcing-in-government project. Unless the organisation has reached maturity in the area of crowdsourcing, identifying a champion and starting a project that does not address authoritative datasets directly is a good way to ensure early success and start the process of organisational learning on how to run such projects. The importance of governmental support and trust is undisputed. If the choice is to use new technologies, this should be accompanied by an investment of appropriate resources within the organisation to ensure that the investment bears fruit. Alternatively, using an existing technology that was successful elsewhere and investing in training and capacity building is another path to success. We also identified the importance of intermediary Non-Governmental Organizations (NGOs) with experience and knowledge of working with crowdsourcing within a partnership. These organizations have the knowledge and skills to implement projects at the boundary between government and the crowd, and can therefore offer the experience to ensure better implementation. Changes and improvements to public services, or a focus on environmental monitoring, can be a good basis for a project. Capturing base mapping is also a good starting point. The recommendations of the report address organisational issues, resources, and legal aspects.
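
    For readers unfamiliar with QCA: it reduces cases to a truth table of condition combinations and examines how consistently each combination leads to the outcome. A toy crisp-set illustration with pandas follows; the condition names echo factors mentioned above, but the case data are invented, not taken from the report.

        # Toy crisp-set QCA truth table: group cases by their combination of
        # binary conditions and tabulate the outcome. Data invented.
        import pandas as pd

        cases = pd.DataFrame({
            "champion":      [1, 1, 0, 1, 0, 1],  # internal project champion
            "ngo_partner":   [1, 0, 0, 1, 1, 1],  # intermediary NGO involved
            "existing_tech": [0, 1, 1, 1, 0, 1],  # reused proven technology
            "success":       [1, 1, 0, 1, 0, 1],
        })

        truth_table = (
            cases.groupby(["champion", "ngo_partner", "existing_tech"])["success"]
            .agg(n="size", consistency="mean")
            .reset_index()
        )
        print(truth_table)

    Each row of the resulting table is one configuration of conditions; "consistency" is the share of cases with that configuration that succeeded, which is the quantity QCA minimisation then works from.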