
    The Vulnerability of a City - Diagnosis from a Bird’s Eye View

    When the tsunami in the Indian Ocean on 26 December 2004 hit the city of Banda Aceh on the island of Sumatra, Indonesia, neither the city administration nor its inhabitants, nor national or international organisations were prepared. Approximately 60,000 of the 260,000 inhabitants died, another 30,000 were left homeless, and the local economy suffered an enormous impact. In the aftermath of this event, tsunami early warning systems were developed and are operated today (e.g. the German Indonesian Tsunami Early Warning System – GITEWS (Lauterjung, 2005)). However, the problem of earthquake or tsunami prediction in a deterministic sense has not yet been solved (Zschau et al., 2002). Thus, an end-to-end tsunami early warning system includes not only the tsunami warning itself, but also the assessment of vulnerability, perception studies and evacuation modelling, eventually leading to technical requirements for monitoring stations and recommendations for adaptation and mitigation strategies (Taubenböck et al., 2009a). In this study we address several specific questions on the capabilities of one discipline – remote sensing – for diagnosing the multi-faceted and complex vulnerability of a city:
    • Which remotely sensed data sets are appropriate for analyzing vulnerability in highly complex urban landscapes?
    • What capabilities and limitations does urban remote sensing have regarding the mapping, analysis and assessment of risks and vulnerability?
    • How can interdisciplinary approaches extend the applicability of earth observation

    Standard Operating Procedure - Collaborative Spatial Assessment CoSA - Release 1.0

    The purpose of this Standard Operating Procedure (SOP) is to establish uniform procedures pertaining to the preparation for, the performance of, and the reporting of COllaborative (geo)Spatial Assessment (CoSA). CoSA provides a synoptic, unbiased assessment of the impact area of a disaster, which feeds the two main recovery perspectives of the Post-Disaster Needs Assessment (PDNA): i) the valuation of damages and losses carried out through the Damage and Loss Assessment (DaLA) methodology; and ii) the identification of human impacts and recovery needs carried out through the Human Recovery Needs Assessment (HRNA). CoSA is distinct from other geospatial and remote sensing based assessments because it i) draws on the collaborative efforts of distributed capacities in remote sensing and geospatial analysis, ii) aims to achieve the highest possible accuracy in line with the requirements of the PDNA, and iii) tries to do so under the stringent timing constraints set by the PDNA schedule. The current SOP will aid in ensuring the credibility, consistency, transparency, accuracy and completeness of the CoSA. It is a living document, however, that will be enriched with new practical experiences and regularly updated to incorporate state-of-the-art procedures and new technical developments. JRC.DG.G.2-Global security and crisis management

    An Approach to Developing a Spatio-Temporal Composite Measure of Climate Change-Related Human Health Impacts in Urban Environments

    Introduction: Rapid population growth, along with an increase in the frequency and intensity of climate change-related impacts in coastal urban environments, emphasizes the need for new tools to help disaster planners and policy makers select and prioritize mitigation and adaptation measures. The concept of community resilience – a measure of how rapidly a community can recover to its previous level of functionality following a disruptive event – is still relatively new to many engineers, planners and policy makers, but is becoming recognized as an increasingly important, some would argue essential, component of the development and subsequent assessment of adaptation plans being considered for communities at risk of climate change-related events. The holistic approach which is the cornerstone of resilience is designed to integrate the physical, economic, health, social and organizational impacts of climate change in urban environments. This research presents a methodology for the development of a quantitative spatial and temporal composite measure for assessing climate change-related health impacts in urban environments. Methods: The proposed method is capable of considering spatial and temporal data from multiple inputs, relating to both physical and social parameters. The approach uses inputs such as total population density and the densities of various demographic groups, burden-of-disease conditions, flood inundation mapping, and land use change for both historical and current conditions. The research has demonstrated that the methodology generates sufficiently accurate information to be useful for planning adaptive strategies. To assemble all inputs into a single measure of health impacts, a weighting system was used to apply various priorities to the spatio-temporal data sources. Weights may be varied to assess how they impact the final results.
Finally, using spatio-temporal extrapolation methods, the future behavior of the same key spatial variables can be projected. Although this method was developed for application to any coastal mega-city, this thesis demonstrates the results obtained for Metro Vancouver, British Columbia, Canada. The data were collected for the years 1981, 1986, 1991, 1996, 2001, 2006 and 2011, as information was readily available for these years. Fine-resolution spatial data for these years were used to give a dynamic simulation of possible health impacts for future projections. Linear and auto-regressive spatio-temporal extrapolations were used to project a 2050s Metro Vancouver health impact map (HIM). Conclusion: Results of this work show that the approach provides a more fully integrated view of the resilience of the city, one which incorporates aspects of population health. The approach would be useful in the development of more targeted adaptation and risk reduction strategies at a local level. In addition, this methodology can be used to generate inputs for further resilience simulations. The overall value of this approach is that it allows for a more integrated assessment of the city's vulnerability and could lead to more effective adaptive strategies.
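The weighted-composite and linear-projection steps described in this abstract can be sketched as follows. This is a minimal illustration only: the layer names, weights, grid values, and normalization scheme are assumptions for demonstration, not the thesis's actual data model or weighting system.

```python
# Sketch: weighted spatio-temporal composite health impact measure (HIM)
# plus per-cell linear extrapolation to a future year. All names and
# values are hypothetical placeholders.
import numpy as np

def composite_him(layers: dict[str, np.ndarray], weights: dict[str, float]) -> np.ndarray:
    """Combine min-max-normalized spatial layers into one weighted composite map."""
    total_w = sum(weights.values())
    him = np.zeros_like(next(iter(layers.values())), dtype=float)
    for name, layer in layers.items():
        lo, hi = layer.min(), layer.max()
        norm = (layer - lo) / (hi - lo) if hi > lo else np.zeros_like(layer, dtype=float)
        him += (weights[name] / total_w) * norm
    return him

def linear_projection(maps_by_year: dict[int, np.ndarray], target_year: int) -> np.ndarray:
    """Per-cell least-squares linear trend, projected to target_year."""
    years = np.array(sorted(maps_by_year))
    stack = np.stack([maps_by_year[y] for y in years])  # shape (t, rows, cols)
    t = years - years.mean()
    slope = (t[:, None, None] * (stack - stack.mean(0))).sum(0) / (t ** 2).sum()
    return stack.mean(0) + slope * (target_year - years.mean())
```

Varying the entries of `weights` is one way to test how the prioritization of data sources changes the final map, as the abstract suggests.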

    Real-Time Social Network Data Mining For Predicting The Path For A Disaster

    Traditional communication channels such as news broadcasts cannot provide spontaneous information about disasters the way social networks, notably Twitter, can. The present research work proposes a framework that mines real-time disaster data from Twitter to predict the path a disaster such as a tornado will take. The users of Twitter act as sensors which provide useful information about the disaster by posting first-hand experiences, warnings or the location of a disaster. The steps involved in the framework are: data collection, data preprocessing, geo-locating the tweets, data filtering and extrapolation of the disaster curve for prediction of susceptible locations. The framework is validated by analyzing past events. It has the potential to be developed into a full-fledged system to predict disasters and warn people about them; the warnings can be sent to news channels or broadcast for pro-active action.
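The final extrapolation step of the framework can be illustrated with a small sketch: fitting the geo-located, time-stamped tweet positions with a simple linear track model and projecting it forward to flag susceptible locations. The degree-1 path model and the coordinate values are assumptions for demonstration, not the framework's actual implementation.

```python
# Sketch: project a disaster track forward from geo-located tweets by
# fitting latitude and longitude as linear functions of time.
import numpy as np

def extrapolate_path(times, lats, lons, future_times):
    """Least-squares linear fit of lat(t) and lon(t), evaluated at future times."""
    lat_fit = np.polyfit(times, lats, 1)   # [slope, intercept]
    lon_fit = np.polyfit(times, lons, 1)
    return [(float(np.polyval(lat_fit, t)), float(np.polyval(lon_fit, t)))
            for t in future_times]
```

In practice a real system would weight tweets by geolocation confidence and filter noise before fitting, as the preprocessing and filtering steps of the framework imply.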

    Assessing Building Vulnerability to Tsunami Hazard Using Integrative Remote Sensing and GIS Approaches

    Risk and vulnerability assessment for natural hazards is of high interest. Various methods focusing on building vulnerability assessment have been developed, ranging from simple to sophisticated approaches depending on the objectives of the study and the availability of data and technology. In-situ assessment methods have been widely used to measure building vulnerability to various types of hazards, while remote sensing methods specifically developed for assessing building vulnerability to tsunami hazard are still very limited. The combination of remote sensing approaches with in-situ methods offers unique opportunities to overcome the limitations of in-situ assessments. The main objective of this research is to develop remote sensing techniques for assessing building vulnerability to tsunami hazard as one of the key elements of risk assessment. The research work has been performed in the framework of the GITEWS (German-Indonesian Tsunami Early Warning System) project. This research contributes to two major components of tsunami risk assessment: (1) the provision of infrastructure vulnerability information as an important element of the exposure assessment; and (2) tsunami evacuation modelling, which is a critical element for assessing immediate response and the capability to evacuate as part of the coping capacity analysis. The newly developed methodology is based on the combination of in-situ measurements and remote sensing techniques in a so-called “bottom-up remote sensing approach”: basic information is acquired by in-situ data collection (bottom level) and then used as input for further analysis in the remote sensing approach (upper level). The results of this research show that a combined in-situ measurement and remote sensing approach can be successfully employed to assess and classify buildings into four classes based on their level of vulnerability to tsunami hazard, with an accuracy of more than 80 percent.
Statistical analysis successfully revealed key spatial parameters – such as size, height, shape, regularity, orientation, and accessibility – which serve as linking parameters between the in-situ and remote sensing approaches. The key spatial parameters and their specified threshold values were implemented in a decision tree algorithm to develop a remote sensing rule-set for building vulnerability classification. A large number of buildings in the study area (Cilacap City, Indonesia) were successfully classified into the building vulnerability classes. The categorization ranges from highly to less vulnerable buildings (A to C) and also includes a category of buildings which are potentially suitable for tsunami vertical evacuation (VE). A multi-criteria analysis was developed that incorporates three main components for vulnerability assessment: stability, tsunami resistance and accessibility. All the defined components were configured in a decision tree algorithm by applying weighting, scoring and threshold definition based on the building sample data. The stability component consists of structural parameters, which are closely related to a building's stability against earthquake energy. Building stability needs to be analyzed because most tsunami events in Indonesia are preceded by major earthquakes. The stability component analysis was applied in the first step of the newly developed decision tree algorithm to evaluate building stability when an earthquake strikes. Buildings with total scores below the defined stability threshold were classified as the most vulnerable class A; such buildings have a high probability of being damaged after earthquake events. The remaining buildings, with total scores above the defined stability threshold, were further analyzed using the tsunami and accessibility components to classify them into the vulnerability classes B, C and VE respectively.
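The weighting, scoring and threshold logic of such a decision tree can be sketched as follows. The weights, thresholds and normalized parameter scales here are invented placeholders; the thesis derives the actual values from building sample data.

```python
# Sketch: threshold-based decision tree assigning building vulnerability
# classes A (most vulnerable), B, C, and VE (vertical evacuation candidate).
# All numeric parameters are hypothetical, for illustration only.
def classify_building(stability, tsunami_resistance, accessibility,
                      stability_threshold=0.5, ve_threshold=0.8):
    """Inputs are normalized scores in [0, 1]; returns a vulnerability class."""
    if stability < stability_threshold:
        return "A"   # likely damaged by the preceding earthquake
    # Remaining buildings: weighted score over tsunami and accessibility components.
    combined = 0.6 * tsunami_resistance + 0.4 * accessibility
    if combined >= ve_threshold:
        return "VE"  # strong and accessible: vertical evacuation candidate
    return "C" if combined >= 0.5 else "B"
```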
This research is based on very high spatial resolution satellite images (QuickBird) and object-based image analysis. Object-based image analysis was chosen because it allows the formulation of rule-sets based on image objects instead of pixels, which has significant advantages, especially for the analysis of very high resolution satellite images. In the pre-processing stage, three image processing steps were performed: geometric correction, pan-sharpening and filtering. Adaptive Local Sigma and Morphological Opening filter techniques were applied as a basis for the subsequent building edge detection. The data pre-processing significantly increased the accuracy of the subsequent image classification steps. In the next step, image segmentation was developed to extract adequate image objects for further classification. Image classification was carried out by grouping the resulting objects into the desired classes based on the derived object features. Each object was assigned the feature characteristics calculated in the segmentation process. The characteristic features of an object – grouped into spectral signature, shape, size, texture, and neighbouring relations – were analysed, selected and semantically modelled to classify objects into object classes. A fuzzy logic algorithm and object feature separation analysis were used to set the membership values of objects grouped into particular classes. Finally, this approach successfully detected and mapped building objects in the study area with their spatial attributes, which provide the base information for building vulnerability classification. A building vulnerability classification rule-set was developed in this research and successfully applied to categorize building vulnerability classes. The developed approach was applied to Cilacap City, Indonesia.
In order to analyze the transferability of this newly developed approach, the algorithm was also applied to Padang City, Indonesia. The results showed that the developed methodology is in general transferable; however, it requires some adaptations (e.g. of thresholds) to provide accurate results. The results of this research show that Cilacap City is very vulnerable to tsunami hazard. Class A (very vulnerable) buildings cover the largest share of the area of Cilacap City (63%), followed by class C (28%), class VE (6%) and class B (3%). Preventive measures should be carried out for the purpose of disaster risk reduction, especially for people living in the most vulnerable buildings. Finally, the results were applied to tsunami evacuation modeling: the buildings categorized as potential candidates for vertical evacuation were selected, and a GIS approach was applied to model evacuation times and evacuation routes. The results of this analysis provide important inputs to the disaster management authorities for future evacuation planning and disaster mitigation.

    Atlas of the Human Planet 2017: Global Exposure to Natural Hazards

    The Atlas of the Human Planet 2017 – Global Exposure to Natural Hazards summarizes the global multi-temporal analysis of exposure to six major natural hazards: earthquakes, volcanoes, tsunamis, floods, tropical cyclone winds, and sea level surge. The exposure assessment focuses on human settlements, assessed through two variables: global built-up area and global resident population. The two datasets are generated within the Global Human Settlement Project of the Joint Research Centre; they formed the core dataset of the Atlas of the Human Planet 2016, which provides empirical evidence on urbanization trends and dynamics. The figures presented in the Atlas 2017 show that exposure to natural hazards has doubled in the last 40 years, both for built-up area and for population. Earthquake is the hazard that accounts for the highest number of people potentially exposed. Flood, the most frequent natural disaster, potentially affects more people in Asia (76.9% of the globally exposed population) and Africa (12.2%) than in other regions. Tropical cyclone winds threaten 89 countries in the world, and the population exposed to cyclones increased from 1 billion in 1975 to 1.6 billion in 2015. The country most at risk from tsunamis is Japan, whose population is 4 times more exposed than that of China, the second country in the ranking. Sea level surge affects countries across the tropical region, and China has seen one of the largest increases in exposed population over the last four decades (plus 200 million people from 1990 to 2015). The figures presented in the Atlas are aggregate estimates at country level. The value of the GHSL layers used to generate the figures in this Atlas is that the data are available at a fine scale, so exposure and the rate of change in exposure can be computed for any area of the world.
Researchers and policy makers can now aggregate exposure information at any geographical scale of analysis, from the country level to regions, continents and the globe. JRC.E.1-Disaster Risk Management

    Scoring, selecting, and developing physical impact models for multi-hazard risk assessment

    This study focuses on scoring, selecting, and developing physical fragility models (i.e., the probability of reaching or exceeding a certain damage state (DS) given a specific hazard intensity) and/or vulnerability models (i.e., the probability of impact given a specific hazard intensity) for assets, with particular emphasis on buildings. Given a set of multiple relevant hazards for a selected case-study region, the proposed procedure involves 1) mapping the relevant asset classes (i.e., construction types for a given occupancy) in the region to a set of existing candidate fragility, vulnerability and/or damage-to-impact models, while also accounting for specific modelling requirements (e.g., time dependency due to ageing/deterioration of buildings, multi-hazard interactions); 2) scoring the candidate models according to relevant criteria to select the most suitable ones for a given application; or 3) using state-of-the-art numerical or empirical methods to develop fragility/vulnerability models not already available. The approach is demonstrated for the buildings of the virtual urban testbed “Tomorrowville”, considering earthquakes, floods, and debris flows as case-study hazards.
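As an illustration of the fragility concept defined in this abstract, one widely used functional form (not necessarily the one adopted in the study) is the lognormal fragility curve P(DS ≥ ds | IM = x) = Φ(ln(x/θ)/β), with median capacity θ and dispersion β. The parameter values below are illustrative assumptions.

```python
# Sketch: lognormal fragility curve giving the probability of reaching or
# exceeding a damage state at a given hazard intensity measure (IM).
from math import erf, log, sqrt

def fragility(im: float, theta: float, beta: float) -> float:
    """Standard normal CDF of ln(im/theta)/beta via the error function."""
    z = log(im / theta) / beta
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))
```

By construction the curve passes through 0.5 at the median capacity `theta` and rises monotonically with intensity, which is the behavior a fragility scoring exercise would check for in candidate models.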

    Application of open-access and 3rd party geospatial technology for integrated flood risk management in data sparse regions of developing countries

    Floods are among the most devastating disasters known, caused by both natural and anthropogenic factors. The trend of flood events is continuously rising, increasing the exposure of vulnerable populations in developed and, especially, developing regions. Floods occur unexpectedly in some circumstances, with little or no warning, and in other cases escalate rapidly, leaving little time to plan, respond and recover. As such, hydrological data are needed before, during and after flooding to ensure effective and integrated flood management. Though hydrological data collection in developed countries has been reasonably well established over long periods, the situation is different in the developing world. Developing regions are plagued with challenges that include inadequate ground monitoring networks attributable to deteriorating infrastructure, organizational deficiencies, lack of technical capacity, inaccessible locations and the huge financial cost of data collection at local and transboundary scales. These limitations result in flawed flood management decisions and aggravate the exposure of the most vulnerable people. Nigeria, the case study for this thesis, experienced unprecedented flooding in 2012 that led to the displacement of 3,871,530 persons, the destruction of infrastructure, the disruption of socio-economic activities valued at 16.9 billion US dollars (1.4% of GDP) and the loss of 363 lives. This flood event revealed the weakness of the nation's flood management system, which has been linked to poor data availability. It motivated this study, which aims to assess these data gaps and explore alternative data sources and approaches, with the hope of improving flood management and decision making upon recurrence.
This study adopts an integrated approach that applies open-access geospatial technology to curb the data and financial limitations that hinder effective flood management in developing regions, in order to enhance disaster preparedness, response and recovery where resources are limited. To estimate the flood magnitudes and return periods needed for planning purposes, the gaps in hydrological data that contribute to poor estimates, and consequently ineffective flood management decisions, for the Niger-South River Basin of Nigeria were filled using Radar Altimetry (RA) and Multiple Imputation (MI) approaches. This reduced the uncertainty associated with missing data, especially at locations where virtual altimetry stations exist. The study revealed that the size and consistency of the gaps within a hydrological time series significantly influence the imputation approach to be adopted. Flood estimates derived from data filled using the RA and MI approaches were similar for short, consecutive gaps (1-3 years) in the time series, while wide, non-consecutive gaps (> 3 years) caused by gauging station discontinuity and damage benefited the most from the RA infilling approach. The 2012 flood event was also quantified as a 1-in-100-year flood, suggesting that if flood management measures had been implemented based on this information, the impact of that event would have been considerably mitigated. Beyond gaps within hydrological time series, in other cases hydrological data may be totally unavailable, or too limited in duration to enable satisfactory estimation of flood magnitudes and return periods, due to financial and logistical limitations in several developing and remote regions. In such cases, Regional Flood Frequency Analysis (RFFA) is recommended, to collate and leverage data from gauging stations in proximity to the area of interest.
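As a sketch of the flood frequency estimation that underlies such return-period statements, a Gumbel (EV1) distribution fitted to an annual-maximum discharge series by the method of moments is one common choice; the thesis does not necessarily use this exact estimator, and the series below is synthetic, not the Niger-South gauge data.

```python
# Sketch: 1-in-T-year flood quantile from a Gumbel (EV1) fit to annual
# maxima, using method-of-moments parameter estimates.
import math
import statistics

def gumbel_quantile(annual_maxima, return_period):
    """Discharge with annual exceedance probability 1/return_period."""
    mean = statistics.mean(annual_maxima)
    std = statistics.stdev(annual_maxima)
    beta = std * math.sqrt(6) / math.pi      # scale parameter
    mu = mean - 0.5772 * beta                # location (Euler-Mascheroni constant)
    p = 1.0 - 1.0 / return_period            # non-exceedance probability
    return mu - beta * math.log(-math.log(p))
```

With gap-filled series from RA or MI infilling, the same estimator can be re-run to see how the imputation choice shifts the 1-in-100-year estimate.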
In this study, RFFA was implemented using the open-access International Centre for Integrated Water Resources Management–Regional Analysis of Frequency Tool (ICI-RAFT), which enables the inclusion of climate variability effects in flood frequency estimation at locations where the assumption of hydrological stationarity is not viable. The Madden-Julian Oscillation was identified as the dominant flood-influencing climate mechanism, with its effect increasing with return period. Similar to other studies, climate-variability-inclusive regional flood estimates were less than those derived from direct techniques at some locations, and higher at others. Also, the maximum historical flood experienced in the region was less than the 1-in-100-year flood event recommended for flood management. The 2012 flood in the Niger-South river basin of Nigeria was recreated in the CAESAR-LISFLOOD hydrodynamic model, combining open-access and third-party Digital Elevation Model (DEM), altimetry, bathymetry, aerial photo and hydrological data. The model was calibrated and validated in three sub-domains against in situ water levels, overflight photos, Synthetic Aperture Radar (SAR) (TerraSAR-X, Radarsat-2, COSMO-SkyMed) and optical (MODIS) satellite images where available, to assess model performance across a range of geomorphological settings and data availability. Improved data availability within constricted river channel areas resulted in better reconstruction of inundation extent and water levels, with the F-statistic falling from 0.808 to 0.187 downstream into the vegetation-dominated delta, where data unavailability is pronounced. Overflight photos helped improve the model-to-reality capture ratio in the vegetation-dominated delta and highlighted the deficiencies of SAR data for delineating flooding in the delta.
Furthermore, the 2012 flood was within the confines of a 1-in-100-year flood for the sub-domain with maximum data availability, suggesting in retrospect that the 2012 flood event could have been managed effectively if flood management plans had been implemented based on a 1-in-100-year flood. During flooding, a fast-paced response is required; however, logistical challenges can hinder access to remote areas to collect the data needed to inform real-time decisions. Thus, this study adopts an integrated approach that combines crowd-sourcing and MODIS flood maps for near-real-time monitoring during the peak flood season of 2015. The results highlighted the merits and demerits of both approaches and demonstrated the need for an integrated approach that leverages the strengths of both methods to enhance flood capture at macro and micro scales. Crowd-sourcing also provided an option for collecting demographic and risk perception data, which were evaluated against a government risk perception map and revealed weaknesses in the government flood models caused by the use of sparse/coarse data and model uncertainty. The C4.5 decision tree algorithm was applied to integrate multiple open-access geospatial datasets to improve the efficiency of SAR image flood detection, and the outputs were further applied in flood model validation. This approach improved the F-statistic from 0.187 to 0.365 and reduced the overall bias of the CAESAR-LISFLOOD model from 3.432 to 0.699. Coarse data resolution, vegetation density, obsolete or non-existent river bathymetry, wetlands, ponds, uncontrolled dredging and illegal sand mining were identified as the factors that contribute to flood model and map uncertainties in the delta region, hence the low accuracy obtained there despite the improvements achieved. Managing floods requires the coordination of efforts before, during and after flooding to ensure optimal mitigation in the event of an occurrence.
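The F-statistic used to compare modelled and observed flood extents is commonly computed as the ratio of the intersection to the union of the two binary inundation maps (1 = wet); assuming that convention, it can be sketched as follows, with toy arrays in place of real SAR-derived and modelled grids.

```python
# Sketch: fit statistic F = |modelled AND observed| / |modelled OR observed|
# over binary inundation grids; F = 1 means perfect extent agreement.
import numpy as np

def f_statistic(modelled: np.ndarray, observed: np.ndarray) -> float:
    m = modelled.astype(bool)
    o = observed.astype(bool)
    union = np.logical_or(m, o).sum()
    # Two all-dry grids agree perfectly by convention.
    return float(np.logical_and(m, o).sum() / union) if union else 1.0
```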
In this study, an integrated flood modelling and mapping approach is undertaken, combining multiple open-access datasets using freely available tools to curb the effects of data and resource deficiencies on hydrological, hydrodynamic and inundation mapping processes and outcomes in developing countries. This approach, if adopted and implemented on a large scale, would improve flood preparedness, response and recovery in data-sparse regions and ensure floods are managed sustainably with limited resources.