
    Scenario-based Tsunami hazard assessment for Northeastern Adriatic coasts

    Full text link
    Significant tsunamis in the Northern Adriatic are rare and only a few historical events are reported in the literature, with sources mostly located along the central and southern parts of the Adriatic coasts. Recently, a tsunami alert system has been established for the whole Mediterranean area; however, a detailed description of the potential impact of tsunami waves on coastal areas is still missing for several sites. This study aims at modelling the hazard associated with possible tsunamis, generated by offshore earthquakes, with the purpose of contributing to tsunami risk assessment for selected urban areas located along the Northeastern Adriatic coasts. Tsunami modelling is performed with the NAMI DANCE software, which accounts for seismic source properties, variable bathymetry, and non-linear effects in wave propagation. Preliminary hazard scenarios at the shoreline are developed for the coastal areas of Northeastern Italy and at selected cities (namely Trieste, Monfalcone, Lignano and Grado). A wide set of potential tsunamigenic sources of tectonic origin, located in three distance ranges (namely at Adriatic-wide, regional and local scales), is considered for the modelling; sources are defined according to the available literature, which includes catalogues of historical tsunamis and existing active fault databases. Accordingly, a preliminary set of tsunami-related parameters and maps is obtained (e.g. maximum run-up, arrival times, synthetic mareograms), relevant for planning mitigation actions at the selected sites. Comment: 31 pages, 15 figures and 4 tables
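    The arrival times computed by full numerical modelling can be cross-checked with a back-of-envelope estimate: in the long-wave (shallow-water) approximation a tsunami travels at c = sqrt(g·h), where h is the water depth. The sketch below is only this first-order estimate over a hypothetical 1-D bathymetry profile, not the NAMI DANCE computation, which also handles non-linear effects and 2-D propagation.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def travel_time(depths_m, dx_m):
    """Estimate tsunami travel time across a 1-D bathymetry profile.

    In the long-wave approximation the phase speed is c = sqrt(g * h),
    so the travel time is the sum of dx / c over the path segments.
    """
    return sum(dx_m / math.sqrt(G * h) for h in depths_m)

# Hypothetical profile: 100 km of uniform 50 m deep shelf, sampled every 1 km
profile = [50.0] * 100
t = travel_time(profile, 1_000.0)
print(f"{t / 60:.1f} minutes")  # roughly 75 minutes at ~22 m/s
```

    On a shallow shelf like the Northern Adriatic, the low value of sqrt(g·h) is what makes arrival times comparatively long despite short source distances.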

    Unified Scaling Law for Earthquakes: Space-Time Dependent Assessment in Friuli-Venezia Giulia Region

    Get PDF
    The concept of the Unified Scaling Law for Earthquakes (USLE), which generalizes the Gutenberg-Richter relationship by making use of the fractal distribution of earthquake sources in a seismic region, is applied to seismicity in the Friuli-Venezia Giulia region, FVG (Northeastern Italy) and its surroundings. In particular, the temporal variations of the USLE coefficients are investigated, with the aim of gaining new insights into the evolving dynamics of seismicity within different tectonic domains of FVG. To this purpose, we consider all magnitude 2.0 or larger earthquakes that occurred in 1995–2019, as reported in the catalog compiled at the National Institute of Oceanography and Applied Geophysics (OGS catalog), within the territory of its homogeneous completeness. The observed variability of seismic dynamics for three sub-regions of the territory under investigation, delimited on the basis of the main geological and tectonic features, is characterized in terms of several moving averages, including: the inter-event time, τ; the cumulative Benioff strain release, Ʃ; the USLE coefficients estimated for moving six-year time intervals; and the USLE control parameter, η. We found that: 1) the USLE coefficients in the FVG region are time-dependent and appear correlated; 2) the dynamical changes of τ, Ʃ, and η in the three sub-regions highlight a number of different seismic regimes; 3) seismic dynamics, before and after the occurrence of the 1998 and 2004 Kobarid (Slovenia) strong main shocks, is characterized by different parameters in the related sub-region. The results obtained for the FVG region confirm similar analyses performed at the global scale, in advance of and after the largest earthquakes worldwide. Moreover, our analysis highlights the spatially heterogeneous and non-stationary features of seismicity in the investigated territory, thus suggesting the opportunity of resorting to time-dependent estimates for improving local seismic hazard assessment. The applied methods and obtained parameters provide a quantitative basis for developing suitable models and forecasting tools, toward a better characterization of future seismic hazard in the region.
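    The USLE is commonly written as log10 N(M, L) = A + B·(5 − M) + C·log10 L, where N is the expected annual number of earthquakes of magnitude M or larger in an area of linear size L (km); C captures the fractal dimension of the source distribution that generalizes Gutenberg-Richter. A minimal sketch, with purely illustrative coefficient values (not those estimated for FVG):

```python
import math

def usle_rate(M, L_km, A, B, C):
    """Expected annual number of earthquakes with magnitude >= M in an
    area of linear size L_km, per the Unified Scaling Law for
    Earthquakes: log10 N = A + B * (5 - M) + C * log10 L."""
    return 10.0 ** (A + B * (5.0 - M) + C * math.log10(L_km))

# Illustrative coefficients only (assumed for this sketch):
A, B, C = -1.0, 1.0, 1.2
r5 = usle_rate(5.0, 50.0, A, B, C)  # rate of M >= 5 events in a 50 km cell
r6 = usle_rate(6.0, 50.0, A, B, C)  # rate of M >= 6 events in the same cell
# With B = 1, each unit of magnitude reduces the rate by a factor of 10
```

    The time dependence studied in the paper corresponds to re-estimating A, B and C over moving six-year windows and tracking their evolution.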

    The contribution of pattern recognition of seismic and morphostructural data to seismic hazard assessment

    Full text link
    The reliable statistical characterization of the spatial and temporal properties of large earthquake occurrence is one of the most debated issues in seismic hazard assessment, due to the unavoidably limited observations from past events. We show that pattern recognition techniques, which are designed in a formal and testable way, may provide significant space-time constraints on impending strong earthquakes. This information, when combined with physically sound methods for ground shaking computation, like the neo-deterministic approach (NDSHA), may produce effectively preventive seismic hazard maps. Pattern recognition analysis of morphostructural data provides quantitative and systematic criteria for identifying the areas prone to the largest events, taking into account a wide set of possible geophysical and geological data, whilst the formal identification of precursory seismicity patterns (by means of the CN and M8S algorithms), duly validated by prospective testing, provides useful constraints on impending strong earthquakes at the intermediate space-time scale. According to a multi-scale approach, the information about the areas where a strong earthquake is likely to occur can be effectively integrated with different observations (e.g. geodetic and satellite data), including regional-scale modeling of the stress field variations and of the seismic ground shaking, so as to identify a set of priority areas for detailed investigations of short-term precursors at the local scale and for microzonation studies. Results from the pattern recognition of earthquake-prone areas (M>=5.0) in the Po plain (Northern Italy), as well as from prospective testing and validation of the time-dependent NDSHA scenarios, are presented. Comment: 33 pages, 7 figures, 9 tables. Submitted to Bollettino di Geofisica Teorica e Applicata (BGTA)

    Crowd-Sourced Buildings Data Collection and Remote Training: New Opportunities to Engage Students in Seismic Risk Reduction

    Get PDF
    Young generations are increasingly committed to understanding disasters, and are a key player in current and future disaster risk reduction activities. The availability of online tools has opened new perspectives in the organization of risk-related educational activities, particularly in earthquake-prone areas. This is the case of CEDAS (building CEnsus for seismic Damage ASsessment), a pilot training activity aimed at collecting risk-related information while educating high-school students about seismic risk. During this experimental activity, students collected and elaborated crowdsourced data on the main building typologies in the proximity of their homes. In a few months, students created a dataset of valuable risk-related information, while getting familiar with the area where they live. Data collection was performed both on-site, using smartphones, and online, based on remote sensing images provided by multiple sources (e.g., Google Maps and Street View). This allowed all students, including those with limited mobility, to perform the activity. The CEDAS experience highlighted the potential of online tools and remote sensing images, combined with practical activities and basic training in exploratory data analysis, to engage students in an inclusive way. The proposed approach can be naturally expanded in a multi-risk perspective, and can be adjusted, eventually increasing the technical content of collected information, to the specific training and expertise of the involved students, from high-school to university level.

    CLIMATIC MODULATION OF SEISMICITY IN THE ALPINE-HIMALAYAN MOUNTAIN RANGE

    Get PDF
    The influence of strain field variations associated with seasonal and longer-term climatic phenomena on earthquake occurrence is investigated. Two regions (Himalaya and Alps), characterized by present-day mountain building and significant glacier retreat, as well as by sufficiently long earthquake catalogues, are suitable for the analysis. Secular variations in the dimensions of permanent glaciers, which are naturally grossly correlated with long-term changes in average surface atmospheric temperature, as well as seasonal snow load, cause crustal deformations that modulate seismicity. Miramare - Trieste, April 2009

    Reality Check: Seismic Hazard Models You Can Trust

    Get PDF
    Seismic safety assessments and improvements can hardly rely on the Seismic Hazard Harmonization in Europe (SHARE) project calculations, at least in Romania and Italy. We urge the necessary revision of widespread probabilistic seismic hazard analysis (PSHA) maps. Physically sound deterministic methods would enable the maximum magnitude of an expected earthquake in seismically hazardous areas to be estimated with statistically justifiable reliability. Deterministic scenarios of catastrophic earthquakes may provide a comprehensive basis for decision making in land-use planning and for adjusting building codes, regulations, and operational emergency management.

    Productivity within the ETAS seismicity model

    Full text link
    The productivity of a magnitude m event can be characterized in terms of triggered events of magnitude above m − Δ: the number of direct "descendants", ν_Δ, and the number of all "descendants", V_Δ. There is evidence in favour of the discrete exponential distribution for both ν_Δ and V_Δ in the presence of a dominant magnitude m (the case of an aftershock cluster). We consider the general Epidemic Type Aftershock Sequence (ETAS) model adapted to any distribution of ν_Δ. It turns out that the branching structure of the model excludes the possibility of having exponential distributions for both productivity characteristics at once. We have analytically investigated the features of the V_Δ distribution within a wide class of ETAS models. We show a fundamental difference in the tail behavior of the V_Δ distributions for general-type clusters and for clusters with a dominant initial magnitude: the tail is heavy in the former case and light in the latter. Real data demonstrate both possibilities. This result provides theoretical and practical constraints for the distributional analysis of V_Δ. Comment: Corresponding author: George Molchan (email: [email protected]), 22 pages, 2 figures, submitted to Geophysical Journal International
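    The branching structure underlying ETAS can be illustrated with a toy subcritical Galton-Watson process (ignoring magnitudes, times and locations): each event independently triggers a geometric ("discrete exponential") number of direct offspring with mean n < 1, and V is the total number of events in the resulting cluster. For this toy model the expected cluster size, excluding the initiating event, is n / (1 − n). A minimal Monte Carlo sketch, not the full ETAS model of the paper:

```python
import random

def rng_geometric(p, rng):
    # Number of failures before the first success; support {0, 1, 2, ...},
    # mean (1 - p) / p.
    k = 0
    while rng.random() > p:
        k += 1
    return k

def total_descendants(mean_offspring, rng):
    """Total number of triggered events in one cluster of a subcritical
    Galton-Watson branching process with geometric offspring counts."""
    p = 1.0 / (1.0 + mean_offspring)  # gives E[offspring] = mean_offspring
    total, generation = 0, 1
    while generation > 0:
        offspring = sum(rng_geometric(p, rng) for _ in range(generation))
        total += offspring
        generation = offspring
    return total

rng = random.Random(42)
n = 0.5  # average number of direct descendants per event (subcritical)
sizes = [total_descendants(n, rng) for _ in range(20_000)]
mean_size = sum(sizes) / len(sizes)
# Theory: E[V] = n / (1 - n) = 1.0 for n = 0.5
```

    Even in this stripped-down setting, the distribution of the total count V differs qualitatively from the geometric distribution of direct offspring, which is the tension the paper analyzes.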

    Operational earthquake forecast/prediction

    No full text
    The operational and decision-making problems related to earthquake forecast/prediction and seismic hazard assessment are nowadays a matter of significant debate, particularly on account of the very unsatisfactory global performance of Probabilistic Seismic Hazard Assessment at the occurrence of most of the recent destructive earthquakes. While it is recognized that operational tools must demonstrate their capability in anticipating large earthquakes and the related ground shaking through a rigorous verification and validation process, only a few methods have proved effective so far. In view of the inherent uncertainties in earthquake prediction, the usefulness of any forecast/prediction method can then be judged taking into account the wide range of possible mitigation actions at different levels (from increased preparedness to evacuation). This paper aims to open a debate and complement the Summary and Recommendations by the International Commission on Earthquake Forecasting, established after the earthquake in L'Aquila (M = 6.3, 6 April 2009). The issues related to the definition, validation, and possible use of forecasting/prediction methodologies are addressed, with special emphasis on existing operational practice in Italy.

    Multi-Scenario Based Assessment of Seismic Hazard: a Must for the Effective Definition of the Seismic Input

    No full text
    Lessons learnt from the destructive earthquakes that occurred during the new millennium provide new opportunities to take action, revise and improve the procedures for seismic hazard assessment. When dealing with cultural heritage and critical structures (e.g. nuclear power plants), where it is necessary to consider extremely long time intervals, the standard hazard estimates are by far unsuitable, due to their basic heuristic limitations. A viable alternative to traditional seismic hazard assessment is represented by the use of scenario earthquakes, characterized at least in terms of magnitude, distance and faulting style, and by the treatment of complex source processes. The scenario-based neo-deterministic approach (NDSHA) to seismic hazard analysis uses realistic and duly validated synthetic time series, accounting for source, propagation and site effects, to construct earthquake scenarios. The NDSHA procedure provides strong ground motion parameters based on the modeling of seismic wave propagation at different scales, accounting for a wide set of possible seismic sources and for the available information about structural models. In practice, the proposed method can be applied at regional (national) scale, computing seismograms at the nodes of a grid with the desired spacing, possibly integrated with time-dependent scenarios, or at local (metropolitan) scale, taking into account detailed source characteristics, the path, and local geological and geotechnical conditions by means of 3D laterally heterogeneous anelastic models. The relevance of the realistic modeling, which permits the generalization of empirical observations by means of physically sound theoretical considerations, is evident, as it allows for the optimization of the structural design with respect to the site of interest.

    Why are the Standard Probabilistic Methods of Estimating Seismic Hazard and Risks Too Often Wrong.

    No full text
    According to the probabilistic seismic hazard analysis (PSHA) approach, the deterministically evaluated or historically defined largest credible earthquakes (often referred to as Maximum Credible Earthquakes, MCEs) are "an unconvincing possibility" and are treated as "likely impossibilities" within individual seismic zones. However, globally over the last decade such events have kept occurring where PSHA predicted seismic hazard to be low. Systematic comparison of the observed ground shaking with that expected by the Global Seismic Hazard Assessment Program (GSHAP) maps discloses gross underestimation worldwide. Several inconsistencies with available observations are found also for national-scale PSHA maps (including Italy), developed using updated data sets. As a result, the expected numbers of fatalities in recent disastrous earthquakes have been underestimated by these maps by approximately two to three orders of magnitude. The total death toll in 2000–2011 (which exceeds 700,000 people, including tsunami victims) calls for a critical reappraisal of the GSHAP results, as well as of the underlying methods. In this chapter, we discuss the limits in the formulation and use of PSHA, addressing some theoretical and practical issues of seismic hazard assessment, which range from the overly simplified assumption that one could reduce the tensor problem of seismic-wave generation and propagation to a scalar problem (as implied by ground motion prediction equations), to the insufficient size and quality of earthquake catalogs for reliable probability modeling at the local scale. Specific case studies are discussed, which may help to better understand the practical relevance of the mentioned issues. The aim is to present a critical overview of different approaches, analyses, and observations in order to provide readers with some general considerations and constructive ideas toward improved seismic hazard and effective risk assessment. Specifically, we show that seismic hazard analysis based on credible scenarios for real earthquakes, defined as neo-deterministic seismic hazard analysis, provides a robust alternative approach for seismic hazard and risk assessment. Therefore, it should be extensively tested as a suitable method for formulating scientifically sound and realistic public policy and building code practices.