26 research outputs found

    Scenario-based Tsunami hazard assessment for Northeastern Adriatic coasts

    Full text link
    Significant tsunamis in the Northern Adriatic are rare and only a few historical events are reported in the literature, with sources mostly located along the central and southern parts of the Adriatic coasts. Recently, a tsunami alert system has been established for the whole Mediterranean area; however, a detailed description of the potential impact of tsunami waves on coastal areas is still missing for several sites. This study aims at modelling the hazard associated with possible tsunamis, generated by offshore earthquakes, with the purpose of contributing to tsunami risk assessment for selected urban areas along the Northeastern Adriatic coasts. Tsunami modelling is performed with the NAMI DANCE software, which accounts for seismic source properties, variable bathymetry, and non-linear effects in wave propagation. Preliminary hazard scenarios at the shoreline are developed for the coastal areas of Northeastern Italy and at selected cities (namely Trieste, Monfalcone, Lignano and Grado). A wide set of potential tsunamigenic sources of tectonic origin, located in three distance ranges (namely at Adriatic-wide, regional and local scales), is considered for the modelling; sources are defined according to the available literature, which includes catalogues of historical tsunamis and existing active-fault databases. Accordingly, a preliminary set of tsunami-related parameters and maps is obtained (e.g. maximum run-up, arrival times, synthetic mareograms), relevant towards planning mitigation actions at the selected sites. Comment: 31 pages, 15 figures and 4 tables
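    The NAMI DANCE simulations themselves are far beyond a short example, but the long-wave physics behind arrival-time estimates can be illustrated. In the shallow-water (long-wave) limit a tsunami travels at c = sqrt(g*h), so the travel time across a bathymetry profile is the sum of segment crossing times. The sketch below uses an invented 1-D profile; the depths and segment length are illustrative assumptions, not data from the study.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def arrival_time(depths_m, segment_km):
    """Estimate tsunami travel time (minutes) across a 1-D bathymetry profile.

    Each entry of depths_m is the mean water depth (m) of one segment of
    length segment_km; the long-wave phase speed is c = sqrt(g * h).
    """
    seconds = 0.0
    for h in depths_m:
        c = math.sqrt(G * h)              # long-wave speed, m/s
        seconds += segment_km * 1000.0 / c
    return seconds / 60.0

# Hypothetical profile: deep offshore shoaling toward the coast (illustrative only).
profile = [200.0, 150.0, 100.0, 50.0, 20.0]   # m
print(f"estimated arrival: {arrival_time(profile, 20.0):.1f} min")
```

    Because the wave slows as the water shallows, the near-shore segments dominate the total travel time, which is one reason detailed coastal bathymetry matters for the arrival-time maps mentioned in the abstract.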

    The contribution of pattern recognition of seismic and morphostructural data to seismic hazard assessment

    Full text link
    The reliable statistical characterization of the spatial and temporal properties of large-earthquake occurrence is one of the most debated issues in seismic hazard assessment, due to the unavoidably limited observations from past events. We show that pattern recognition techniques, which are designed in a formal and testable way, may provide significant space-time constraints on impending strong earthquakes. This information, when combined with physically sound methods for ground shaking computation, like the neo-deterministic approach (NDSHA), may produce effectively preventive seismic hazard maps. Pattern recognition analysis of morphostructural data provides quantitative and systematic criteria for identifying the areas prone to the largest events, taking into account a wide set of possible geophysical and geological data, whilst the formal identification of precursory seismicity patterns (by means of the CN and M8S algorithms), duly validated by prospective testing, provides useful constraints on impending strong earthquakes at the intermediate space-time scale. According to a multi-scale approach, the information about the areas where a strong earthquake is likely to occur can be effectively integrated with different observations (e.g. geodetic and satellite data), including regional-scale modeling of the stress-field variations and of the seismic ground shaking, so as to identify a set of priority areas for detailed investigations of short-term precursors at the local scale and for microzonation studies. Results from the pattern recognition of earthquake-prone areas (M>=5.0) in the Po Plain (Northern Italy), as well as from prospective testing and validation of the time-dependent NDSHA scenarios, are presented. Comment: 33 pages, 7 Figures, 9 Tables. Submitted to Bollettino di Geofisica Teorica e Applicata (BGTA)

    Crowd-Sourced Buildings Data Collection and Remote Training: New Opportunities to Engage Students in Seismic Risk Reduction

    Get PDF
    Young generations are increasingly committed to understanding disasters, and are a key player in current and future disaster risk reduction activities. The availability of online tools has opened new perspectives in the organization of risk-related educational activities, particularly in earthquake-prone areas. This is the case of CEDAS (building CEnsus for seismic Damage Assessment), a pilot training activity aimed at collecting risk-related information while educating high-school students about seismic risk. During this experimental activity, students collected and elaborated crowdsourced data on the main building typologies in the proximity of their homes. In a few months, students created a dataset of valuable risk-related information, while getting familiar with the area where they live. Data collection was performed both on-site, using smartphones, and online, based on remote sensing images provided by multiple sources (e.g. Google Maps and Street View). This allowed all students, including those with limited mobility, to perform the activity. The CEDAS experience pointed out the potential of online tools and remote sensing images, combined with practical activities and basic training in exploratory data analysis, to engage students in an inclusive way. The proposed approach can be naturally expanded in a multi-risk perspective, and can be adjusted, eventually increasing the technical content of collected information, to the specific training and expertise of the involved students, from high-school to university level

    CLIMATIC MODULATION OF SEISMICITY IN THE ALPINE-HIMALAYAN MOUNTAIN RANGE

    Get PDF
    The influence on earthquake occurrence of strain-field variations associated with seasonal and longer-term climatic phenomena is investigated. Two regions (Himalaya and Alps), characterized by present-day mountain building and significant glacier retreat, as well as by sufficiently long earthquake catalogues, are suitable for the analysis. Secular variations in the dimensions of permanent glaciers, which are naturally grossly correlated with long-term changes in average surface atmospheric temperature, as well as seasonal snow load, cause crustal deformations that modulate seismicity. MIRAMARE - TRIESTE, April 2009
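    A standard way to quantify seasonal modulation of seismicity is the Schuster test, which maps each event time to an annual phase and checks whether the resultant of the unit phasors is larger than a 2-D random walk would allow. The abstract does not state which statistical test the authors used, so the sketch below is a generic illustration on synthetic catalogues, not the paper's procedure; the catalogue sizes and modulation strength are invented.

```python
import math
import random

def schuster_p(event_days, period_days=365.25):
    """Schuster test: p-value for the null hypothesis of no periodic modulation.

    Each event time is mapped to a phase angle; under the null the vector sum
    of unit phasors performs a 2-D random walk, giving p = exp(-D^2 / n),
    where D is the resultant length and n the number of events.
    """
    n = len(event_days)
    cos_sum = sum(math.cos(2 * math.pi * t / period_days) for t in event_days)
    sin_sum = sum(math.sin(2 * math.pi * t / period_days) for t in event_days)
    return math.exp(-(cos_sum**2 + sin_sum**2) / n)

random.seed(0)
span = 365.25 * 30                         # a 30-year synthetic catalogue
uniform = [random.uniform(0, span) for _ in range(2000)]
# winter-peaked catalogue: rejection sampling against a sinusoidal annual rate
seasonal = [t for t in (random.uniform(0, span) for _ in range(20000))
            if random.random() < 0.5 * (1 + math.cos(2 * math.pi * t / 365.25))][:2000]
print(schuster_p(uniform), schuster_p(seasonal))
```

    The uniform catalogue yields an unremarkable p-value, while the seasonally modulated one yields a vanishingly small p, rejecting the no-modulation null.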

    Productivity within the ETAS seismicity model

    Full text link
    The productivity of a magnitude m event can be characterized in terms of triggered events of magnitude above m − Δ: it is the number of direct "descendants" ν_Δ and the number of all "descendants" V_Δ. There is evidence in favour of the discrete exponential distribution for both ν_Δ and V_Δ with a dominant magnitude m (the case of an aftershock cluster). We consider the general Epidemic Type Aftershock Sequence (ETAS) model adapted to any distribution of ν_Δ. It turns out that the branching structure of the model excludes the possibility of having exponential distributions for both productivity characteristics at once. We have analytically investigated the features of the V_Δ distribution within a wide class of ETAS models. We show the fundamental difference in the tail behavior of the V_Δ distributions for general-type clusters and for clusters with a dominant initial magnitude: the tail is heavy in the former case and light in the latter. Real data demonstrate both possibilities. This result provides theoretical and practical constraints for the distributional analysis of V_Δ. Comment: Corresponding author: George Molchan (email: [email protected]), 22 pages, 2 figures, submitted to Geophysical Journal International
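    The relation between direct offspring ν_Δ and total descendants V_Δ can be illustrated with a minimal Galton-Watson cascade: draw each event's direct offspring from a geometric (discrete exponential) law and iterate over generations. This is a deliberately simplified sketch, not the full ETAS model (no magnitudes, no Omori kernel), and the mean productivity value is arbitrary; it merely shows that V_Δ is a random sum over the whole branching tree, not just the first generation.

```python
import random

def rng_geometric(p, rng):
    """Number of failures before the first success: support 0, 1, 2, ..."""
    k = 0
    while rng.random() > p:
        k += 1
    return k

def total_descendants(mean_offspring, rng, max_events=100000):
    """Total descendants V of one ancestor in a Galton-Watson cascade whose
    direct-offspring ("productivity") law is geometric with the given mean."""
    p = 1.0 / (1.0 + mean_offspring)      # geometric parameter: mean = (1-p)/p
    total = 0
    frontier = 1                          # events in the current generation
    while frontier and total < max_events:
        children = sum(rng_geometric(p, rng) for _ in range(frontier))
        total += children
        frontier = children
    return total

rng = random.Random(42)
nu = 0.8                                  # arbitrary subcritical mean productivity
samples = [total_descendants(nu, rng) for _ in range(5000)]
mean_v = sum(samples) / len(samples)
# for a subcritical cascade the expected total is nu / (1 - nu), i.e. 4.0 here
print(f"empirical mean of V: {mean_v:.2f}")
```

    Even in this toy setting the distribution of V is much more dispersed than that of the direct offspring, which is the intuition behind the paper's result that both characteristics cannot be exponential at once.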

    Operational earthquake forecast/prediction

    No full text
    The operational and decision-making problems related to earthquake forecast/prediction and seismic hazard assessment are nowadays a matter of significant debate, particularly on account of the very unsatisfactory global performance of Probabilistic Seismic Hazard Assessment at the occurrence of most of the recent destructive earthquakes. While it is recognized that operational tools must demonstrate their capability in anticipating large earthquakes and the related ground shaking through a rigorous verification and validation process, only a few methods have proved effective so far. In view of the inherent uncertainties in prediction, the usefulness of any forecast/prediction method can then be judged taking into account the wide range of possible mitigation actions at different levels (from increased preparedness to evacuation). This paper aims to open a debate and complement the Summary and Recommendations by the International Commission on Earthquake Forecasting, established after the earthquake in L'Aquila (M = 6.3, 6 April 2009). The issues related to the definition, validation, and possible use of forecasting/prediction methodologies are addressed, with special emphasis on existing operational practice in Italy

    Multi-Scenario Based Assessment of Seismic Hazard: a Must for the Effective Definition of the Seismic Input

    No full text
    Lessons learnt from the destructive earthquakes that occurred during the new millennium provide new opportunities to take action, revise, and improve the procedure for seismic hazard assessment. When dealing with cultural heritage and critical structures (e.g. nuclear power plants), where it is necessary to consider extremely long time intervals, the standard hazard estimates are by far unsuitable, due to their basic heuristic limitations. A viable alternative to traditional seismic hazard assessment is represented by the use of scenario earthquakes, characterized at least in terms of magnitude, distance and faulting style, and by the treatment of complex source processes. The scenario-based neo-deterministic approach (NDSHA) to seismic hazard analysis uses realistic and duly validated synthetic time series, accounting for source, propagation and site effects, to construct earthquake scenarios. The NDSHA procedure provides strong ground motion parameters based on the modeling of seismic-wave propagation at different scales, accounting for a wide set of possible seismic sources and for the available information about structural models. The method can be applied at the regional (national) scale, computing seismograms at the nodes of a grid with the desired spacing, also integrated with time-dependent scenarios, or at the local (metropolitan) scale, taking into account detailed source characteristics, the path, and local geological and geotechnical conditions by means of 3D laterally heterogeneous anelastic models. The relevance of realistic modeling, which permits the generalization of empirical observations by means of physically sound theoretical considerations, is evident, as it allows for the optimization of the structural design with respect to the site of interest

    Why are the Standard Probabilistic Methods of Estimating Seismic Hazard and Risks Too Often Wrong.

    No full text
    According to the probabilistic seismic hazard analysis (PSHA) approach, the deterministically evaluated or historically defined largest credible earthquakes (often referred to as Maximum Credible Earthquakes, MCEs) are "an unconvincing possibility" and are treated as "likely impossibilities" within individual seismic zones. However, globally over the last decade such events keep occurring where PSHA predicted seismic hazard to be low. Systematic comparison of the observed ground shaking with that expected according to the Global Seismic Hazard Assessment Program (GSHAP) maps discloses gross underestimation worldwide. Several inconsistencies with available observations are found also for national-scale PSHA maps (including Italy's), developed using updated data sets. As a result, the numbers of fatalities expected in recent disastrous earthquakes have been underestimated by these maps by approximately two to three orders of magnitude. The total death toll in 2000-2011 (which exceeds 700,000 people, including tsunami victims) calls for a critical reappraisal of GSHAP results, as well as of the underlying methods. In this chapter, we discuss the limits in the formulation and use of PSHA, addressing some theoretical and practical issues of seismic hazard assessment, which range from the overly simplified assumption that one could reduce the tensor problem of seismic-wave generation and propagation to a scalar problem (as implied by ground motion prediction equations), to the insufficient size and quality of earthquake catalogs for reliable probability modeling at the local scale. Specific case studies are discussed, which may help to better understand the practical relevance of the mentioned issues.
The aim is to present a critical overview of different approaches, analyses, and observations in order to provide readers with some general considerations and constructive ideas toward improved seismic hazard and effective risk assessment. Specifically, we show that seismic hazard analysis based on credible scenarios for real earthquakes, defined as neo-deterministic seismic hazard analysis, provides a robust alternative approach for seismic hazard and risk assessment. Therefore, it should be extensively tested as a suitable method for formulating scientifically sound and realistic public policy and building code practices

    Seismic Hazard Scenarios as Preventive Tools for a Disaster Resilient Society

    No full text
    Lessons learnt from the destructive earthquakes that occurred during the new millennium provide new opportunities to take action, revise, and improve the procedure for seismic hazard assessment (SHA). A single hazard map cannot meet the requirements of different end-users; the mapping of the expected earthquake ground motion that accounts for events' recurrence may be suitable for insurers. When dealing with cultural heritage and critical structures (e.g. nuclear power plants), where it is necessary to consider extremely long time intervals, the standard hazard estimates are by far unsuitable, due to their basic heuristic limitations. While time-dependent SHA may be suitable to increase earthquake preparedness, by planning adequate mitigation actions, for critical structures (i.e. those for which the consequences of failure are intolerable) the maximum possible seismic input is relevant. Therefore, the need for an appropriate estimate of the seismic hazard, aimed not only at the seismic classification of the national territory but also at properly accounting for the local amplifications of ground shaking, as well as for the fault properties, is a pressing concern for seismic engineers. A viable alternative to traditional SHA is represented by the use of scenario earthquakes, characterized at least in terms of magnitude, distance, and faulting style, and by the treatment of complex source processes. The relevance of realistic modeling, which permits the generalization of empirical observations by means of physically sound theoretical considerations, is evident, as it allows the optimization of the structural design with respect to the site of interest. The time information associated with the scenarios of ground motion, given by intermediate-term middle-range earthquake predictions, can be useful to public authorities in assigning priorities for timely mitigation actions.
Therefore, the approach we have developed naturally supplies realistic time series of ground motion, useful for preserving urban settings, historical monuments, and relevant man-made structures

    A seismic quiescence before the 2017 Mw 7.3 Sarpol Zahab (Iran) earthquake: Detection and analysis by improved RTL method

    No full text
    A major earthquake, with magnitude Mw 7.3, struck Sarpol Zahab (Kermanshah province, Iran) on November 12, 2017, causing extensive damage and casualties. The epicenter was located in the northwestern part of the Zagros mountain range, an active belt originated by the Arabia-Eurasia collision. We explore the seismicity preceding this earthquake using the Iranian Seismological Center instrumental earthquake catalog (IGTU), with the aim of identifying possible anomalies in background seismicity that can be related to this and other future large events. For this purpose, we used a method for intermediate-term forecasts of large earthquakes, namely the Region-Time-Length (RTL) algorithm, which analyzes declustered catalogs and is sensitive to quiescences that may precede major earthquakes. RTL has been progressively refined and has been applied in several regions worldwide during the last decades. To decluster the earthquake catalog we used a quite novel approach, based on the nearest-neighbour distances between events in the space-time-energy domain, a method that preserves the background seismicity while removing the clustered component. The retrospective application of the RTL algorithm to the area surrounding the mainshock epicenter highlights two significant quiescences: one preceding the Sarpol Zahab Mw 7.3 earthquake, and the other occurring before a Mw 5.7 earthquake that struck the same region in November 2013. The duration of the quiescences ranges from a few months to one year and is compatible with earlier results from different regions of the world. In addition, we applied an enhanced variant of the RTL algorithm, which allows us to draw maps for the whole study region and retains only the quiescences consistently detected for different choices of the free parameters, and hence more stable. The resulting map for Northwestern Iran, calculated for the time span 1 June 2017-11 November 2017, evidences two broad quiescence regions, oriented NW-SE along the Zagros belt.
One, located to the north, shows a significant seismic anomaly corresponding to the Sarpol Zahab earthquake, which disappears immediately after the event. The second, located in the southeastern part of the study region, persists up to the end of the available catalog (October 4, 2018).
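    In its standard (Sobolev-Tyupkin) form, RTL combines three factors summed over prior events near a target point: an epicentral-distance weight R, an elapsed-time weight T, and a rupture-size weight L; each factor is detrended so that negative RTL values flag a quiescence. The sketch below computes only the raw R*T*L product, without detrending; the cutoffs, the characteristic scales r0 and t0, the rupture-length scaling, and the toy catalogue are all illustrative assumptions, not the parameters or data of the study.

```python
import math

def rtl(events, x, y, t, r0_km=50.0, t0_days=365.0):
    """Raw R*T*L product at location (x, y) and time t.

    events: list of (x_km, y_km, t_days, magnitude) tuples.
    Only prior events within 2*r0 and 2*t0 contribute (a common cutoff choice).
    A full implementation would also subtract the long-term trend of each
    factor, so that negative RTL indicates a seismic quiescence.
    """
    R = T = L = 0.0
    for ex, ey, et, mag in events:
        dt = t - et
        r = math.hypot(x - ex, y - ey)
        if dt <= 0 or dt > 2 * t0_days or r > 2 * r0_km:
            continue
        rupture_km = 10 ** (0.5 * mag - 1.8)   # illustrative rupture-length scaling
        R += math.exp(-r / r0_km)
        T += math.exp(-dt / t0_days)
        L += rupture_km / max(r, 1.0)          # guard against zero distance
    return R * T * L

# toy catalogue: steady activity over roughly two years, then nothing
catalog = [(10.0 * i % 80, 7.0 * i % 80, 30.0 * i, 4.0) for i in range(24)]
print(rtl(catalog, 40.0, 40.0, 720.0))    # during activity: positive
print(rtl(catalog, 40.0, 40.0, 1800.0))   # long after it stops: zero
```

    After detrending, the drop of such a product below its long-term level is what the algorithm reads as a quiescence; the enhanced variant described in the abstract additionally keeps only anomalies stable across different choices of r0 and t0.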