40 research outputs found

    Comparative analysis of a new assessment of the seismic risk of residential buildings of two districts of Barcelona

    There are personal and institutional decisions that can increase the seismic resilience of the buildings in a city. However, some of these decisions are only possible with basic knowledge of the buildings’ seismic risk. This document describes the main results of a detailed study of the seismic vulnerability and seismic risk of residential buildings in Ciutat Vella (the oldest district of Barcelona) and Nou Barris (one of the newest districts of Barcelona). In this study, we assessed seismic risk with the probabilistic Vulnerability Index Method (VIM_P). Moreover, we analyzed the influence of basic building features on the final vulnerability and seismic risk values; for instance, we assessed the seismic vulnerability and risk of groups of buildings defined by their number of stories. The findings reveal that the annual frequency of exceedance of the collapse damage state for Ciutat Vella buildings is, on average, 4.7 times higher than for the buildings in Nou Barris. Moreover, according to the Best vulnerability curve, 70.31% and 2.81% of Ciutat Vella and Nou Barris buildings, respectively, have an annual frequency of exceedance of the collapse damage state greater than 1 × 10⁻⁔. The present research has received partial funding from the European Union’s Horizon 2020 research and innovation programme (grant agreement No. 823844, ChEESE CoE Project). Peer Reviewed. Postprint (author's final draft).
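
    The headline metric above, the annual frequency of exceedance of the collapse damage state, can be obtained by convolving a seismic hazard curve with a collapse fragility curve. The sketch below is a minimal illustration of that convolution, assuming a hazard curve given as annual exceedance frequencies over an intensity measure and a lognormal collapse fragility; the numbers and function names are illustrative, not taken from the study.

```python
import numpy as np
from scipy.stats import norm

def annual_collapse_frequency(im, hazard_aef, median_im, beta):
    """Convolve a hazard curve with a lognormal collapse fragility.

    im         : intensity-measure levels (increasing)
    hazard_aef : annual frequency of exceedance of each level
    median_im  : median collapse capacity (same units as im)
    beta       : lognormal standard deviation of the fragility
    """
    # P(collapse | IM = im) from the lognormal fragility curve
    p_collapse = norm.cdf(np.log(im / median_im) / beta)
    # Annual rate of occurrence of each IM level (negative slope of the hazard curve)
    occurrence = -np.gradient(hazard_aef, im)
    # Total annual frequency of exceeding the collapse damage state
    return np.trapz(p_collapse * occurrence, im)

# Illustrative numbers only (not values from the Barcelona study)
im = np.linspace(0.05, 1.0, 200)            # e.g. PGA in g
hazard_aef = 1e-2 * (im / 0.05) ** -2.5     # toy hazard curve
print(annual_collapse_frequency(im, hazard_aef, median_im=0.6, beta=0.5))
```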

    Probabilistic seismic hazard and risk assessment in Spain

    This monograph explains how probabilistic seismic risk assessments can be performed at different resolution levels using the same methodology, providing results in terms of the same metrics, while also highlighting how the inputs to the analysis differ between cases. A country-level assessment is first performed using a coarse-grained exposure database that includes only the building stock in the urban regions of Spain. A detailed urban seismic risk assessment is then performed for Lorca, Murcia, Spain. In both cases, the fully probabilistic seismic risk results are expressed in terms of the loss exceedance curve, which is the main output and from which different probabilistic risk metrics, such as the average annual loss and the probable maximum loss, can be derived. Because damage data are available for the May 2011 Lorca earthquake, the losses observed for the city's building stock were compared with those modelled for an earthquake scenario with similar characteristics in terms of location, magnitude and spectral accelerations. The results of the comparison are presented in terms of expected monetary losses, and the damage levels associated with the modelled mean damage ratios are compared with those observed in post-earthquake surveys. This monograph is an effort to explain, in a transparent and comprehensive way and through a step-by-step example, how risk can be calculated in probabilistic terms, how the inputs influence each stage of the analysis, and what the obtained results mean. After reviewing several available tools for estimating catastrophe risk by means of probabilistic approaches, the CAPRA Platform was chosen because of its flexibility, its compatibility with assessments at different resolution levels, and its open-source/freeware character.
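
    The loss exceedance curve mentioned above is the basis for the two risk metrics cited, the average annual loss (AAL) and the probable maximum loss (PML). As a rough illustration, assuming the curve is given as annual exceedance rates over a set of loss levels, the AAL can be taken as the area under the curve and the PML as the loss at a chosen return period; the code below is a generic sketch, not the CAPRA implementation.

```python
import numpy as np

def aal_and_pml(loss, exceedance_rate, return_period=475.0):
    """Derive average annual loss (AAL) and probable maximum loss (PML)
    from a loss exceedance curve.

    loss            : loss levels, increasing
    exceedance_rate : annual rate of exceeding each loss level (decreasing)
    return_period   : return period (years) at which the PML is read
    """
    # AAL equals the area under the loss exceedance curve
    aal = np.trapz(exceedance_rate, loss)
    # PML is the loss whose exceedance rate equals 1 / return_period
    target_rate = 1.0 / return_period
    pml = np.interp(target_rate, exceedance_rate[::-1], loss[::-1])
    return aal, pml

# Toy curve for illustration only
loss = np.linspace(0, 1e9, 500)           # monetary loss
rate = 0.5 * np.exp(-loss / 5e7)          # annual exceedance rate
print(aal_and_pml(loss, rate))
```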

    Developing and applying the concept of Value of Information to optimise data collection strategies for seismic hazard assessment

    In seismic hazard assessments, the importance of knowing different input parameters accurately depends on their weight within the hazard model. Many aspects of such assessments require inputs based on knowledge and data from experts. When it comes to decisions about data collection, facility owners and seismic hazard analysts need to balance the possible added value brought by acquiring new data against the budget and time available for its collection. In other words, they need to answer the question “Is it worth paying to obtain this information?” Assessing the value of information (VoI) before data collection should lead to optimising the time and money that one is willing to invest. This thesis presents a method that combines available data and expert judgment to facilitate the decision-making process within the site-response component of seismic hazard assessments. The approach integrates influence diagrams and decision trees to map the causal relationships between input parameters in site-response analysis, and Bayesian inference to update the model when new evidence is considered. Here, the VoI is assessed for univariate, bivariate and multivariate uncertain parameters to infer an optimal seismic design for typical buildings and critical facilities. For the first time in the field of seismic hazard assessment and earthquake engineering, a framework is developed to integrate prior knowledge, the characteristics of ground investigation techniques, and design safety requirements. The consistent findings across different applications show that VoI is highly sensitive to the prior probabilities and to the accuracy of the test to be performed. This highlights the importance of defining these from available data, as well as of only considering tests that are suitable for our needs and budget. The developed VoI framework constitutes a useful decision-making tool for hazard analysts and facility owners, enabling not only the prioritisation of data collection for key input parameters and the identification of optimal tests, but also the justification of the associated decisions. This approach can enhance the accuracy and reliability of seismic hazard assessments, leading to more effective risk management strategies.
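
    The sensitivity of VoI to prior probabilities and test accuracy can be illustrated with a minimal two-state decision: a site is either "soft" or "stiff", a design must be chosen before or after running an imperfect test, and the value of the test is the reduction in expected cost it buys. The example below is a generic value-of-imperfect-information calculation under assumed priors, test likelihoods and costs; none of the numbers come from the thesis.

```python
import numpy as np

# Hypothetical two-state problem: true site class is "soft" or "stiff".
prior = np.array([0.3, 0.7])            # P(soft), P(stiff) -- assumed priors

# Cost matrix: rows = design choice (conservative, standard),
# columns = true state (soft, stiff). Illustrative monetary units.
cost = np.array([[10.0, 10.0],          # conservative design: costly but always adequate
                 [30.0,  4.0]])         # standard design: cheap unless the site is soft

# Test likelihoods: P(test says "soft" | true state); an 85%-accurate test.
p_pos = np.array([0.85, 0.15])

def expected_cost(p_state):
    """Expected cost of the best design choice under belief p_state."""
    return (cost @ p_state).min()

# 1. Expected cost with no test (decide on the prior alone)
cost_without = expected_cost(prior)

# 2. Expected cost with the test: Bayesian update for each possible outcome
p_outcome_pos = (p_pos * prior).sum()
post_pos = p_pos * prior / p_outcome_pos                # posterior if test says "soft"
post_neg = (1 - p_pos) * prior / (1 - p_outcome_pos)    # posterior if test says "stiff"
cost_with = (p_outcome_pos * expected_cost(post_pos)
             + (1 - p_outcome_pos) * expected_cost(post_neg))

# 3. Value of (imperfect) information = what the test is worth before paying for it
print("VoI =", cost_without - cost_with)
```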

    Fault System-Based Probabilistic Seismic Hazard Assessment of a Moderate Seismicity Region: The Eastern Betics Shear Zone (SE Spain)

    Including faults as seismogenic sources in probabilistic seismic hazard assessment (PSHA) has become common practice as knowledge of active faults improves. Moreover, the occurrence of earthquakes in multi-fault ruptures has highlighted the need to understand faults as interacting systems rather than independent sources. We present a PSHA for southeastern Spain obtained by including the faults of the Eastern Betics Shear Zone (EBSZ), a moderate-seismicity region in SE Spain, as the main seismogenic sources in two separate source models, one of which also considers background seismicity. In contrast with previous studies in Spain, earthquake occurrence in the EBSZ system is modeled considering different hypotheses of multi-fault ruptures at the scale of the whole fault system, weighted in a logic tree. We compare the hazard levels with those from an area-source PSHA and a previous fault-based approach. The results show a clear control of the EBSZ faults on the seismic hazard for all return periods, drastically increasing the hazard levels in the regions close to the fault traces and extending the faults' influence up to 20 km farther than in the area-source PSHA. The seismic hazard depends on the fault slip rates: peak ground accelerations and the territorial extent of the fault influence are higher around the Alhama de Murcia and Carboneras faults, while lower-slip-rate faults (e.g., the Palomares Fault) contribute little to the hazard. For the 475-year return period and near-fault locations, our models are more consistent with the ground-motion values reached in the 2011 Mw 5.2 Lorca event than the building code or the national seismic hazard map, which suggests that our fault-system-based model yields more accurate estimates for this return period. Fault data, mainly slip rates, and their uncertainties have a clear impact on the seismic hazard, and for some faults the lack of detailed paleoseismic studies can compromise the reliability of the hazard estimates. These issues, together with epistemic uncertainties concerning the background seismicity, are key discussion points in the present study, which is intended to serve as a case example for other low-to-moderate seismicity regions worldwide.
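
    A central ingredient of a fault-based PSHA such as the one above is converting a fault's geometry and slip rate into an earthquake recurrence rate. A common back-of-the-envelope route, sketched below under assumed values, balances the geologic moment rate (ÎŒ · area · slip rate) against the seismic moment of a characteristic earthquake; the fault dimensions and slip rate used here are placeholders, not the values adopted for the EBSZ faults.

```python
# Moment-balance sketch: annual rate of a characteristic earthquake on a single fault.
MU = 3.0e10            # crustal shear modulus, Pa (typical assumption)

def characteristic_rate(length_km, width_km, slip_rate_mm_yr, magnitude):
    """Annual rate of the characteristic earthquake from moment balancing."""
    area = length_km * 1e3 * width_km * 1e3               # fault area, m^2
    moment_rate = MU * area * slip_rate_mm_yr * 1e-3       # N*m accumulated per year
    m0 = 10 ** (1.5 * magnitude + 9.1)                     # seismic moment (Hanks & Kanamori)
    return moment_rate / m0                                # events per year

# Placeholder fault: 60 km long, 12 km wide, 1 mm/yr slip rate, Mw 6.8 characteristic event
rate = characteristic_rate(60, 12, 1.0, 6.8)
print(f"rate = {rate:.2e} /yr, recurrence ~ {1 / rate:.0f} yr")
```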

    Earthquake hazard and risk analysis for natural and induced seismicity: towards objective assessments in the face of uncertainty

    The fundamental objective of earthquake engineering is to protect lives and livelihoods through the reduction of seismic risk. Directly or indirectly, this generally requires quantification of the risk, for which quantification of the seismic hazard is required as a basic input. Over the last several decades, the practice of seismic hazard analysis has evolved enormously, firstly with the introduction of a rational framework for handling the apparent randomness in earthquake processes, which also enabled risk assessments to consider both the severity and likelihood of earthquake effects. The next major evolutionary step was the identification of epistemic uncertainties related to incomplete knowledge, and the formulation of frameworks for both their quantification and their incorporation into hazard assessments. Despite these advances in the practice of seismic hazard analysis, it is not uncommon for the acceptance of seismic hazard estimates to be hindered by invalid comparisons, resistance to new information that challenges prevailing views, and attachment to previous estimates of the hazard. The challenge of achieving impartial acceptance of seismic hazard and risk estimates becomes even more acute in the case of earthquakes attributed to human activities. A more rational evaluation of seismic hazard and risk due to induced earthquakes may be facilitated by adopting, with appropriate adaptations, the advances in risk quantification and risk mitigation developed for natural seismicity. While such practices may provide an impartial starting point for decision making regarding risk mitigation measures, the most promising avenue to achieve broad societal acceptance of the risks associated with induced earthquakes is through effective regulation, which needs to be transparent, independent, and informed by risk considerations based on both sound seismological science and reliable earthquake engineering

    Developing a global risk engine

    Risk analysis is a critical link in the reduction of casualties and damage due to earthquakes. Recognition of this has led to a rapid rise in demand for accurate, reliable and flexible risk assessment software. However, there is a significant disparity between the high-quality scientific data developed by researchers and the availability of versatile, open and user-friendly risk analysis tools that meet the demands of end-users. In the past few years, several open-source software packages that play an important role in seismic research have been developed, such as OpenSHA and OpenSEES. There is, however, still a gap when it comes to open-source risk assessment tools and software. In order to fill this gap, the Global Earthquake Model (GEM) has been created. GEM is an internationally sanctioned program initiated by the OECD that aims to build independent, open standards to calculate and communicate earthquake risk around the world. This initiative started with a one-year pilot project named GEM1, during which a number of existing risk software packages were evaluated. After a critical review of the results, it was concluded that none of them met GEM's requirements and that a new object-oriented tool therefore had to be developed. This paper presents a summary of some of the best-known applications used in risk analysis, highlighting the main aspects that were considered in the development of this risk platform. The research carried out to gather the information needed to build this tool was organized into four areas: the information technology approach, seismic hazard resources, vulnerability assessment methodologies, and sources of exposure data. The main aspects and findings for each of these areas are presented, as well as how these features were incorporated into the current risk engine. Currently, the risk engine is capable of predicting human or economic losses worldwide for both deterministic and probabilistic events, using vulnerability curves. A first version of GEM will become available at the end of 2013. Until then, the risk engine will continue to be developed by a growing community of developers, using a dedicated open-source platform.
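
    The core scenario-loss step described above, applying vulnerability curves to an exposed building stock for a given ground-motion field, can be sketched in a few lines. The example below assumes a vulnerability curve expressed as mean damage ratio versus ground-motion intensity and a small portfolio of assets with replacement values; it is a generic illustration, not code from the GEM risk engine.

```python
import numpy as np

def scenario_loss(gm_intensity, asset_values, vuln_im, vuln_mdr):
    """Expected monetary loss for one ground-motion scenario.

    gm_intensity : ground-motion intensity at each asset (e.g. PGA in g)
    asset_values : replacement value of each asset
    vuln_im      : intensity levels of the vulnerability curve
    vuln_mdr     : mean damage ratio (0..1) at each intensity level
    """
    # Interpolate the mean damage ratio at each asset's shaking level
    mdr = np.interp(gm_intensity, vuln_im, vuln_mdr)
    # Loss per asset = value * mean damage ratio; return the portfolio total
    return float(np.sum(mdr * asset_values))

# Toy vulnerability curve and portfolio (illustrative numbers only)
vuln_im  = np.array([0.0, 0.1, 0.2, 0.4, 0.8])
vuln_mdr = np.array([0.0, 0.02, 0.10, 0.35, 0.80])
values   = np.array([2e5, 5e5, 1e6])            # three assets
shaking  = np.array([0.15, 0.30, 0.55])         # scenario intensities at the assets
print(scenario_loss(shaking, values, vuln_im, vuln_mdr))
```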

    Compatibility of Seismic Hazard and Risk Calculations with Historical Observations

    This dissertation addresses the evaluation of earthquake hazard and damage modelling through comparison with historical earthquakes and damage information. Ina Cecic, president of the European Seismological Commission, remarks: "I am pleased to see that we keep extending our activities to interdisciplinary tasks, moving beyond 'pure' seismology into other fields that are connected in real life. This is particularly important for education and public outreach, since only awareness of the hazards around us and knowledge of what to do can save lives in the long run." In this work, a systematic framework for the area-based evaluation of the quality of seismic hazard maps is first developed. For many decades, the earthquake hazard map has been a mandatory part of design practice in most countries, so efforts to assess the actual quality of these maps are essential. Different metrics and criteria are applied and discussed in detail, and the question is examined using the influence of the magnitude-frequency distribution as an example. Second, numerous research reports and documents dealing with the vulnerability of buildings to earthquake ground motion in China have been published over the last four decades, since a comprehensive assessment of the seismic vulnerability of buildings is a key task of earthquake safety and damage assessment. For this reason, 69 articles and dissertations were first scrutinized, examining the building damage caused by earthquakes in densely populated areas. They represent observations in which the macroseismic intensities were determined according to the official Chinese Seismic Intensity Scale. From these many studies, mean fragility functions (as a function of macroseismic intensity) are derived for four damage limit states of the two most widespread building types: masonry and reinforced concrete. In addition, 18 publications providing analytical fragility functions (as a function of peak ground acceleration, PGA) for the same damage classes and building categories were examined. In this way, a solid fragility database, based on both intensity and PGA, is compiled for seismically endangered areas of mainland China. A comprehensive overview of the problems in assessing fragility for different building types is given, and a necessary comparison with international projects of similar focus is carried out. Based on the newly collected fragility database, a new approach for deriving the intensity-PGA relationship is proposed that uses fragility as a bridge, and optimized intensity-PGA relationships are developed. This approach reduces the scatter of the traditional intensity-PGA relationship. Third, risk analysis requires the building stock exposed to an earthquake. This study therefore develops a geo-coded residential building exposure model for mainland China, using census data on a 1 km × 1 km grid as a proxy. To evaluate the model performance, the residential floor area developed here is compared at district level with records from the statistical yearbook. It is shown that the floor area developed in this study, after adjustment for uniform construction costs, is well comparable to the floor-area statistics of Shanghai. The model is also applied in a risk analysis of the M8.0 Wenchuan earthquake: the total loss estimated from this residential exposure model roughly matches the loss value derived from damage reports based on field investigations, which demonstrates the robustness of the residential building stock model developed here. Finally, damage estimation is carried out using different methods and compared with damage derived from loss reports. To improve the loss distribution pattern, the development of a regional Human Development Index is proposed, and sensitivity tests are carried out to examine the effect of each factor on the determination of the final loss.
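
    The "fragility as a bridge" idea described above can be illustrated with a small numerical sketch: for each macroseismic intensity, find the PGA at which the analytical (PGA-based) fragility predicts the same exceedance probability as the empirical (intensity-based) fragility. The lognormal parameters below are placeholders, not the values derived in the dissertation.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical lognormal fragility parameters for one damage state / building type
MEDIAN_I, BETA_I = 8.0, 0.8        # intensity-based (empirical) fragility
MEDIAN_PGA, BETA_PGA = 0.30, 0.6   # PGA-based (analytical) fragility, PGA in g

def frag_intensity(I):
    """P(damage state exceeded | macroseismic intensity I)."""
    return norm.cdf(np.log(I / MEDIAN_I) / BETA_I)

def pga_for_intensity(I):
    """PGA at which the PGA-based fragility gives the same exceedance probability."""
    p = frag_intensity(I)
    # Analytical inverse of the lognormal PGA-based fragility
    return MEDIAN_PGA * np.exp(BETA_PGA * norm.ppf(p))

for I in (6, 7, 8, 9, 10):
    print(f"I = {I}  ->  PGA ~ {pga_for_intensity(I):.3f} g")
```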

    The making of the NEAM Tsunami Hazard Model 2018 (NEAMTHM18)

    The NEAM Tsunami Hazard Model 2018 (NEAMTHM18) is a probabilistic hazard model for tsunamis generated by earthquakes. It covers the coastlines of the North-eastern Atlantic, the Mediterranean, and connected seas (NEAM). NEAMTHM18 was designed as a three-phase project. The first two phases were dedicated to model development and hazard calculations, following a formalized decision-making process based on a multiple-expert protocol. The third phase was dedicated to documentation and dissemination. The hazard assessment workflow was structured in Steps and Levels. There are four Steps: Step-1) probabilistic earthquake model; Step-2) tsunami generation and modeling in deep water; Step-3) shoaling and inundation; Step-4) hazard aggregation and uncertainty quantification. Each Step includes a different number of Levels. Level-0 always describes the input data; the other Levels describe the intermediate results needed to proceed from one Step to the next. Alternative datasets and models were considered in the implementation. The epistemic hazard uncertainty was quantified through an ensemble modeling technique that accounts for the weights of alternative models and yields a distribution of hazard curves represented by the mean and various percentiles. Hazard curves were calculated at 2,343 Points of Interest (POI) distributed at an average spacing of ~20 km. Precalculated probability maps for five maximum inundation heights (MIH) and hazard intensity maps for five average return periods (ARP) were produced from the hazard curves. In the entire NEAM Region, MIHs of several meters are rare but not impossible. Considering a 2% probability of exceedance in 50 years (ARP ≈ 2,475 years), the POIs with MIH > 5 m are fewer than 1% and are all in the Mediterranean, on the coasts of Libya, Egypt, Cyprus, and Greece. In the North-East Atlantic, POIs with MIH > 3 m are on the coasts of Mauritania and the Gulf of Cadiz. Overall, 30% of the POIs have MIH > 1 m. NEAMTHM18 results and documentation are available through the TSUMAPS-NEAM project website (http://www.tsumaps-neam.eu/), featuring an interactive web mapper. Although NEAMTHM18 cannot substitute for in-depth analyses at local scales, it represents a first step toward local and more detailed hazard and risk assessments and contributes to designing evacuation maps for tsunami early warning.
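
    The link quoted above between a 2% probability of exceedance in 50 years and an average return period of roughly 2,475 years follows from the standard Poisson (memoryless) assumption, ARP = -T / ln(1 - p). A minimal check:

```python
import math

def average_return_period(p_exceedance, window_years):
    """Average return period implied by an exceedance probability over a time window,
    assuming Poissonian (memoryless) occurrence."""
    return -window_years / math.log(1.0 - p_exceedance)

print(average_return_period(0.02, 50))   # ~2475 years
```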