
    Good practices in PSHA: declustering, b-value estimation, foreshocks and aftershocks inclusion; a case study in Italy

    SUMMARY: The classical procedure of probabilistic seismic hazard analysis (PSHA) requires a Poissonian distribution of earthquakes. Seismic catalogues follow a Poisson distribution only after the application of a declustering algorithm that leaves one earthquake per seismic sequence (usually the strongest, i.e. the mainshock). Removing earthquakes from seismic catalogues leads to underestimation of the annual event rates and, consequently, to lower seismic hazard estimates, as indicated by several studies. In this study, we investigate the performance of two declustering methods on the Italian instrumental catalogue and the impact of declustering on b-value estimation and on seismic hazard analysis. To this end, the spatial variation in the seismicity rate was first estimated from the declustered catalogues using the adaptive smoothed seismicity approach, considering small earthquakes (Mw ≥ 3.0). We then corrected the seismicity rates using a new approach that counts all events in the complete seismic catalogue by simply changing the magnitude-frequency distribution. The impact of declustering on seismic hazard analysis is illustrated with PSHA maps for Italy, in terms of peak ground acceleration and spectral acceleration at 2 s, with 10 per cent and 2 per cent probability of exceedance in 50 yr. We observed that the hazard calculated from the declustered catalogues was always lower than the hazard computed using the complete catalogue. These results agree with previous results obtained in different parts of the world.
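    Two quantities central to this abstract, the Poisson probability of exceedance and the Gutenberg-Richter b-value, can be sketched in a few lines (a minimal illustration; the function names and the choice of the Aki maximum-likelihood estimator with Utsu's binning correction are ours, not the paper's code):

```python
import math

def prob_exceedance(annual_rate, years=50.0):
    """Poisson probability of at least one exceedance in `years`."""
    return 1.0 - math.exp(-annual_rate * years)

def aki_b_value(magnitudes, mc, dm=0.1):
    """Maximum-likelihood b-value (Aki, 1965) with Utsu's correction
    for magnitudes binned in steps of `dm` above completeness `mc`."""
    mean_m = sum(magnitudes) / len(magnitudes)
    return math.log10(math.e) / (mean_m - (mc - dm / 2.0))

# An annual exceedance rate of 1/475 gives ~10% probability of
# exceedance in 50 yr, the standard PSHA reference level.
p = prob_exceedance(1.0 / 475.0, 50.0)
```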

    Earthquake forecasting and seismic hazard analysis: some insights on the testing phase and the modeling

    This thesis is divided into three chapters. In the first chapter we analyse the results of the global forecasting experiment run by the Collaboratory for the Study of Earthquake Predictability (CSEP). We take the opportunity of this experiment to contribute to the definition of a more robust and reliable statistical procedure for evaluating earthquake forecasting models. We first present the models and the target earthquakes to be forecast, then explain the consistency and comparison tests used in CSEP experiments to evaluate model performance. Introducing a methodology to create ensemble forecasting models, we show that models, when properly combined, almost always perform better than any single model. In the second chapter we discuss in depth one of the basic features of PSHA: the declustering of the seismicity rates. We first introduce the Cornell-McGuire method for PSHA and present the different motivations behind the need to decluster seismic catalogs. Using a theorem of modern probability theory (Le Cam's theorem), we show that declustering is not necessary to obtain the Poissonian behaviour of exceedances that is usually considered fundamental for transforming exceedance rates into exceedance probabilities in the PSHA framework. We present a method to correct PSHA for declustering, building a more realistic PSHA. In the last chapter we explore the methods commonly used to account for epistemic uncertainty in PSHA. The most widely used is the logic tree, which underpins the most advanced seismic hazard maps. We illustrate the probabilistic structure of the logic tree, and then show that this structure is not adequate to describe epistemic uncertainty. We then propose a new probabilistic framework based on ensemble modelling that properly accounts for epistemic uncertainties in PSHA.
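    The ensemble idea from the first chapter, combining individual model forecasts into a weighted mixture, can be sketched as follows (a hypothetical minimal version; operational CSEP ensembles derive the weights more carefully, e.g. from past forecast performance):

```python
def ensemble_forecast(forecasts, weights):
    """Combine per-cell expected earthquake rates from several models
    into one weighted-average forecast (weights need not sum to 1)."""
    total = sum(weights)
    norm = [w / total for w in weights]
    ncells = len(forecasts[0])
    return [sum(w * f[i] for w, f in zip(norm, forecasts))
            for i in range(ncells)]
```

    With equal weights this reduces to the plain average of the models' rates in each cell; unequal weights let better-performing models dominate the mixture.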

    Evaluation of a Decade-Long Prospective Earthquake Forecasting Experiment in Italy

    Earthquake forecasting models represent our current understanding of the physics and statistics that govern earthquake occurrence processes. Providing such forecasts as falsifiable statements can help us assess a model’s hypothesis to be, at the least, a plausible conjecture to explain the observations. Prospective testing (i.e., with future data, once the model and experiment have been fully specified) is fundamental in science because it enables confronting a model with completely out‐of‐sample data and zero degrees of freedom. Testing can also help inform decisions regarding the selection of models, data types, or procedures in practical applications, such as Probabilistic Seismic Hazard Analysis. In 2010, a 10‐year earthquake forecasting experiment began in Italy, where researchers collectively agreed on authoritative data sources, testing rules, and formats to independently evaluate a collection of forecasting models. Here, we test these models with ten years of fully prospective data using a multiscore approach to (1) identify the model features that correlate with data‐consistent or ‐inconsistent forecasts; (2) evaluate the stability of the experiment results over time; and (3) quantify the models’ limitations to generate spatial forecasts consistent with earthquake clustering. As each testing metric analyzes only limited properties of a forecast, the proposed synoptic analysis using multiple scores allows drawing more robust conclusions. Our results show that the best‐performing models use catalogs that span over 100 yr and incorporate fault information, demonstrating and quantifying the value of these data types. Model rankings are stable over time, suggesting that a 10‐year period in Italy can provide sufficient data to discriminate between optimal and suboptimal forecasts. Finally, no model can adequately describe spatial clustering, but those including fault information are less inconsistent with the observations. 
Prospective testing assesses relevant assumptions and hypotheses of earthquake processes truly out‐of‐sample, thus guiding model development and decision‐making to improve society’s earthquake resilience.
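    A standard ingredient of such multiscore evaluations is the joint Poisson log-likelihood of the observed earthquake counts under a gridded forecast; a minimal sketch is shown below (the actual CSEP consistency tests add simulation-based significance estimates on top of scores like this one):

```python
import math

def poisson_loglik(forecast_rates, observed_counts):
    """Joint log-likelihood of per-cell observed counts under independent
    Poisson distributions with the forecast rates as means."""
    ll = 0.0
    for lam, n in zip(forecast_rates, observed_counts):
        # log of the Poisson pmf: n*log(lam) - lam - log(n!)
        ll += n * math.log(lam) - lam - math.lgamma(n + 1)
    return ll
```

    Comparing two models then amounts to comparing their log-likelihoods on the same observed catalog, with higher values indicating forecasts more consistent with the data.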

    A simple two-state model interprets temporal modulations in eruptive activity and enhances multivolcano hazard quantification

    Volcanic activity typically switches between high-activity states with many eruptions and low-activity states with few or no eruptions. We present a simple two-regime, physics-informed statistical model for interpreting temporal modulations in eruptive activity. The model enhances comprehension and comparison of different volcanic systems and enables homogeneous integration into multivolcano hazard assessments that account for potential changes in volcanic regime. The model satisfactorily fits the eruptive histories of the three active volcanoes in the Neapolitan area, Italy (Mt. Vesuvius, Campi Flegrei, and Ischia), which encompass a wide range of volcanic behaviors. We find that these volcanoes have appreciably different processes for triggering and ending high-activity periods, connected to the different dominant volcanic processes controlling their eruptive activity, with different characteristic times and activity rates (expressed as number of eruptions per time interval). Presently, all three volcanoes are judged to be in a low-activity state, with eruption probability decreasing from Mt. Vesuvius to Ischia to Campi Flegrei.
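    A two-regime model of this kind can be illustrated with a small simulation, assuming exponentially distributed regime durations and Poissonian eruptions within each regime (all parameter names and values below are invented for illustration, not fitted to the Neapolitan volcanoes):

```python
import random

def simulate_two_state(t_end, rate_high, rate_low,
                       mean_dur_high, mean_dur_low, seed=1):
    """Simulate eruption times under a two-state (high/low activity) model.
    Regime durations are exponential; eruptions are Poissonian within
    each regime, at a regime-dependent rate."""
    rng = random.Random(seed)
    t, state, eruptions = 0.0, "high", []
    while t < t_end:
        mean_dur = mean_dur_high if state == "high" else mean_dur_low
        rate = rate_high if state == "high" else rate_low
        dur = rng.expovariate(1.0 / mean_dur)
        # draw eruption times within this regime window
        tau = t
        while True:
            tau += rng.expovariate(rate)
            if tau >= min(t + dur, t_end):
                break
            eruptions.append(tau)
        t += dur
        state = "low" if state == "high" else "high"
    return eruptions
```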

    Prospective CSEP evaluation of 1-Day, 3-Month, and 5-Yr earthquake forecasts for Italy

    In 2009, the global Collaboratory for the Study of Earthquake Predictability (CSEP) launched three experiments to forecast the distribution of earthquakes in Italy in the subsequent 5 yrs. CSEP solicited forecasts for seismicity tomorrow, in the next three months, and for the entire 5 yrs. In those 5 yrs, the Istituto Nazionale di Geofisica e Vulcanologia (INGV) recorded 83 target earthquakes with local magnitude 3.95 ≤ ML < 4.95, and 14 larger shocks. The results show that 1-day forecasts are consistent with the number and magnitudes of the target earthquakes, and one version of the epidemic-type aftershock sequence (ETAS) model is also consistent with the spatial distribution; ensemble forecasts, which we created for the 1-day experiment, are consistent with the number, locations, and magnitudes of the target earthquakes, and they perform as well as the best model; none of the 3-month time-independent models produces consistent forecasts; the best 5-yr models account for the fault distribution and the historical seismicity; and 5-yr models based on instrumental seismicity and b-value spatial variation show poor forecasting performance. © 2018 Seismological Society of America. All rights reserved.

    The SPTLC1 p.S331 mutation bridges sensory neuropathy and motor neuron disease and has implications for treatment

    Aims: SPTLC1-related disorder is a late-onset sensory-autonomic neuropathy associated with perturbed sphingolipid homeostasis, which can be improved by supplementation with the serine palmitoyl-CoA transferase (SPT) substrate l-serine. Recently, a juvenile form of motor neuron disease has been linked to SPTLC1 variants. Variants affecting the p.S331 residue of SPTLC1 cause a distinct phenotype whose pathogenic basis has not been established. This study aims to define the neuropathological and biochemical consequences of SPTLC1 p.S331 variants and to test the response to l-serine in this specific genotype. Methods: We report the clinical and neurophysiological characterisation of two unrelated children carrying distinct p.S331 SPTLC1 variants. The neuropathology was investigated by analysis of the sural nerve and of skin innervation. To clarify the biochemical consequences of the p.S331 variant, we performed sphingolipidomic profiling of serum and skin fibroblasts. We also tested the effect of l-serine supplementation in skin fibroblasts of patients with p.S331 mutations. Results: In both patients we recognised an early-onset phenotype with predominant, progressive motor neuron disease. Neuropathology showed severe damage to the sensory and autonomic systems. Sphingolipidomic analysis showed the coexistence of neurotoxic deoxy-sphingolipids with an excess of canonical products of the SPT enzyme. l-serine supplementation in patient fibroblasts reduced the production of toxic 1-deoxysphingolipids but further increased the overproduction of sphingolipids. Conclusions: Our findings suggest that p.S331 SPTLC1 variants lead to an overlap phenotype combining features of sensory and motor neuropathies, proposing a continuum in the spectrum of SPTLC1-related disorders. l-serine supplementation in these patients may be detrimental.

    The making of the NEAM Tsunami Hazard Model 2018 (NEAMTHM18)

    The NEAM Tsunami Hazard Model 2018 (NEAMTHM18) is a probabilistic hazard model for tsunamis generated by earthquakes. It covers the coastlines of the North-eastern Atlantic, the Mediterranean, and connected seas (NEAM). NEAMTHM18 was designed as a three-phase project. The first two phases were dedicated to the model development and hazard calculations, following a formalized decision-making process based on a multiple-expert protocol. The third phase was dedicated to documentation and dissemination. The hazard assessment workflow was structured in Steps and Levels. There are four Steps: Step-1) probabilistic earthquake model; Step-2) tsunami generation and modeling in deep water; Step-3) shoaling and inundation; Step-4) hazard aggregation and uncertainty quantification. Each Step includes a different number of Levels. Level-0 always describes the input data; the other Levels describe the intermediate results needed to proceed from one Step to another. Alternative datasets and models were considered in the implementation. The epistemic hazard uncertainty was quantified through an ensemble modeling technique accounting for alternative models' weights and yielding a distribution of hazard curves represented by the mean and various percentiles. Hazard curves were calculated at 2,343 Points of Interest (POI) distributed at an average spacing of ∼20 km. Precalculated probability maps for five maximum inundation heights (MIH) and hazard intensity maps for five average return periods (ARP) were produced from hazard curves. In the entire NEAM Region, MIHs of several meters are rare but not impossible. Considering a 2% probability of exceedance in 50 years (ARP≈2,475 years), the POIs with MIH >5 m are fewer than 1% and are all in the Mediterranean, on the coasts of Libya, Egypt, Cyprus, and Greece. In the North-East Atlantic, POIs with MIH >3 m are on the coasts of Mauritania and the Gulf of Cadiz. Overall, 30% of the POIs have MIH >1 m.
    NEAMTHM18 results and documentation are available through the TSUMAPS-NEAM project website (http://www.tsumaps-neam.eu/), featuring an interactive web mapper. Although the NEAMTHM18 cannot substitute in-depth analyses at local scales, it represents the first action to start local and more detailed hazard and risk assessments and contributes to designing evacuation maps for tsunami early warning.
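    The ensemble step described above, turning a weighted family of alternative hazard curves into a mean curve and percentile curves, can be sketched as follows (a simplified version; the weighted-percentile convention used here is one common choice, not necessarily NEAMTHM18's exact aggregation rule):

```python
def ensemble_hazard(curves, weights, percentiles=(0.16, 0.50, 0.84)):
    """Weighted mean and weighted percentiles of exceedance probabilities,
    computed independently at each hazard-intensity level."""
    total = sum(weights)
    w = [x / total for x in weights]
    nlev = len(curves[0])
    mean = [sum(wi * c[j] for wi, c in zip(w, curves)) for j in range(nlev)]
    pcts = {}
    for p in percentiles:
        level_vals = []
        for j in range(nlev):
            # sort model values at this level, then walk the cumulative weight
            vals = sorted(zip((c[j] for c in curves), w))
            cum, q = 0.0, vals[-1][0]
            for v, wi in vals:
                cum += wi
                if cum >= p:
                    q = v
                    break
            level_vals.append(q)
        pcts[p] = level_vals
    return mean, pcts
```

    The spread between the low and high percentile curves at a given intensity level is then a direct measure of the epistemic uncertainty carried by the alternative models.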

    The Collaboratory for the Study of Earthquake Predictability: Achievements and Priorities

    The Collaboratory for the Study of Earthquake Predictability (CSEP) is a global cyberinfrastructure for prospective evaluations of earthquake forecast models and prediction algorithms. CSEP’s goals are to improve our understanding of earthquake predictability, advance forecasting model development, test key scientific hypotheses and their predictive power, and improve seismic hazard assessments. Since its inception in California in 2007, the global CSEP collaboration has been conducting forecast experiments in a variety of tectonic settings and at a global scale, and now operates four testing centers on four continents to automatically and objectively evaluate models against prospective data. These experiments have provided a multitude of results that are informing operational earthquake forecasting systems and seismic hazard models, and they have provided new and sometimes surprising insights into the predictability of earthquakes and spurred model improvements. CSEP has also conducted pilot studies to evaluate ground-motion and hazard models. Here, we report on selected achievements from a decade of CSEP, and we present our priorities for future activities.

    Performance of CMS muon reconstruction in pp collision events at sqrt(s) = 7 TeV

    The performance of muon reconstruction, identification, and triggering in CMS has been studied using 40 inverse picobarns of data collected in pp collisions at sqrt(s) = 7 TeV at the LHC in 2010. A few benchmark sets of selection criteria covering a wide range of physics analysis needs have been examined. For all considered selections, the efficiency to reconstruct and identify a muon with a transverse momentum pT larger than a few GeV is above 95% over the whole region of pseudorapidity covered by the CMS muon system, abs(eta) < 2.4, while the probability to misidentify a hadron as a muon is well below 1%. The efficiency to trigger on single muons with pT above a few GeV is higher than 90% over the full eta range, and typically substantially better. The overall momentum scale is measured to a precision of 0.2% with muons from Z decays. The transverse momentum resolution varies from 1% to 6% depending on pseudorapidity for muons with pT below 100 GeV and, using cosmic rays, it is shown to be better than 10% in the central region up to pT = 1 TeV. Observed distributions of all quantities are well reproduced by the Monte Carlo simulation.