
    Good practices in PSHA: declustering, b-value estimation, foreshocks and aftershocks inclusion; a case study in Italy

    SUMMARY: The classical procedure of probabilistic seismic hazard analysis (PSHA) requires a Poissonian distribution of earthquakes. Seismic catalogues follow a Poisson distribution only after the application of a declustering algorithm that leaves a single earthquake per seismic sequence (usually the strongest, i.e. the mainshock). Removing earthquakes from seismic catalogues leads to underestimation of the annual event rates and consequently, as several studies have indicated, to lower estimates of seismic hazard. In this study, we investigate the performance of two declustering methods on the Italian instrumental catalogue and the impact of declustering on b-value estimation and on seismic hazard analysis. To this end, the spatial variation in the seismicity rate was first estimated from the declustered catalogues using an adaptive smoothed-seismicity approach that considers small earthquakes (Mw ≄ 3.0). We then corrected the seismicity rates using a new approach that counts all events in the complete seismic catalogue by simply changing the magnitude-frequency distribution. The impact of declustering on seismic hazard analysis is illustrated using PSHA maps for Italy, in terms of peak ground acceleration and spectral acceleration at 2 s, with 10 per cent and 2 per cent probabilities of exceedance in 50 yr. We observed that the hazard calculated from the declustered catalogues was always lower than the hazard computed using the complete catalogue. These results agree with previous results obtained in different parts of the world.
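    The b-value estimation mentioned above is commonly done with the Aki maximum-likelihood estimator, corrected for magnitude binning following Utsu. As an illustrative sketch only (the abstract does not state which estimator the authors used; the function name and the example magnitudes are hypothetical):

    ```python
    import math

    def b_value_aki_utsu(mags, m_c, dm=0.1):
        """Maximum-likelihood b-value (Aki 1965) with Utsu's binning correction.

        mags : catalogue magnitudes; only events with M >= m_c are used
        m_c  : magnitude of completeness
        dm   : magnitude binning width
        """
        above = [m for m in mags if m >= m_c]
        mean_m = sum(above) / len(above)
        # b = log10(e) / (mean(M) - (Mc - dm/2))
        return math.log10(math.e) / (mean_m - (m_c - dm / 2.0))

    # Toy example: three complete events with mean magnitude 3.5, Mc = 3.0
    b = b_value_aki_utsu([3.0, 3.5, 4.0], m_c=3.0)  # ~0.79
    ```

    Because declustering preferentially removes small aftershocks, applying such an estimator before and after declustering is one way to quantify the b-value change the study investigates.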

    Earthquake forecasting and seismic hazard analysis: some insights on the testing phase and the modeling

    This thesis is divided into three chapters. In the first chapter we analyse the results of the worldwide forecasting experiment run by the Collaboratory for the Study of Earthquake Predictability (CSEP). We take the opportunity of this experiment to contribute to the definition of a more robust and reliable statistical procedure for evaluating earthquake forecasting models. We first present the models and the target earthquakes to be forecast, then explain the consistency and comparison tests used in CSEP experiments to evaluate model performance. Introducing a methodology to create ensemble forecasting models, we show that models, when properly combined, almost always perform better than any single model. In the second chapter we discuss in depth one of the basic features of PSHA: the declustering of seismicity rates. We first introduce the Cornell-McGuire method for PSHA and present the different motivations behind the need to decluster seismic catalogues. Using a theorem of modern probability theory (Le Cam's theorem), we show that declustering is not necessary to obtain the Poissonian behaviour of exceedances that is usually considered fundamental for transforming exceedance rates into exceedance probabilities in the PSHA framework. We present a method to correct PSHA for declustering, building a more realistic PSHA. In the last chapter we explore the methods commonly used to account for epistemic uncertainty in PSHA. The most widely used is the logic tree, which underlies the most advanced seismic hazard maps. We illustrate the probabilistic structure of the logic tree and show that this structure is not adequate to describe epistemic uncertainty. We then propose a new probabilistic framework based on ensemble modelling that properly accounts for epistemic uncertainties in PSHA.
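    The transformation of exceedance rates into exceedance probabilities that the second chapter refers to is, under the Poisson assumption, the standard relation P = 1 − exp(−λT). A minimal sketch (the function name is mine, not the thesis's):

    ```python
    import math

    def exceedance_probability(rate, t_years):
        """Probability of at least one exceedance in t_years, given an annual
        exceedance rate, under the Poisson (memoryless) assumption:
            P = 1 - exp(-rate * t_years)
        """
        return 1.0 - math.exp(-rate * t_years)

    # Classic PSHA benchmark: a 475-yr return period gives ~10% in 50 yr
    p = exceedance_probability(1.0 / 475.0, 50.0)  # ~0.10
    ```

    The thesis's point, via Le Cam's theorem, is that this Poisson conversion of exceedances can remain approximately valid even without declustering the catalogue.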

    Evaluation of a Decade-Long Prospective Earthquake Forecasting Experiment in Italy

    Earthquake forecasting models represent our current understanding of the physics and statistics that govern earthquake occurrence processes. Providing such forecasts as falsifiable statements can help us assess a model’s hypothesis to be, at the least, a plausible conjecture to explain the observations. Prospective testing (i.e., with future data, once the model and experiment have been fully specified) is fundamental in science because it enables confronting a model with completely out‐of‐sample data and zero degrees of freedom. Testing can also help inform decisions regarding the selection of models, data types, or procedures in practical applications, such as Probabilistic Seismic Hazard Analysis. In 2010, a 10‐year earthquake forecasting experiment began in Italy, where researchers collectively agreed on authoritative data sources, testing rules, and formats to independently evaluate a collection of forecasting models. Here, we test these models with ten years of fully prospective data using a multiscore approach to (1) identify the model features that correlate with data‐consistent or ‐inconsistent forecasts; (2) evaluate the stability of the experiment results over time; and (3) quantify the models’ limitations to generate spatial forecasts consistent with earthquake clustering. As each testing metric analyzes only limited properties of a forecast, the proposed synoptic analysis using multiple scores allows drawing more robust conclusions. Our results show that the best‐performing models use catalogs that span over 100 yr and incorporate fault information, demonstrating and quantifying the value of these data types. Model rankings are stable over time, suggesting that a 10‐year period in Italy can provide sufficient data to discriminate between optimal and suboptimal forecasts. Finally, no model can adequately describe spatial clustering, but those including fault information are less inconsistent with the observations. 
Prospective testing assesses relevant assumptions and hypotheses of earthquake processes truly out‐of‐sample, thus guiding model development and decision‐making to improve society’s earthquake resilience.
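    One of the simplest testing metrics in the multiscore approach described above is the CSEP-style number test (N-test), which checks whether the observed event count is plausible under a model's forecast rate. A sketch under the usual Poisson-count assumption (function names are illustrative; the actual experiment uses a broader suite of tests):

    ```python
    import math

    def poisson_cdf(n, lam):
        """P(X <= n) for X ~ Poisson(lam); fine for the small counts used here."""
        return sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(n + 1))

    def n_test(n_obs, lam):
        """Two one-sided quantiles of the observed count under the forecast.
        A small delta1 suggests the model underpredicts the number of events;
        a small delta2 suggests it overpredicts."""
        delta1 = 1.0 - poisson_cdf(n_obs - 1, lam)  # P(X >= n_obs)
        delta2 = poisson_cdf(n_obs, lam)            # P(X <= n_obs)
        return delta1, delta2

    d1, d2 = n_test(5, 5.0)  # observed count equals the forecast mean
    ```

    Because each such score probes only one property of a forecast (count, spatial distribution, magnitude distribution), combining several scores supports the more robust synoptic conclusions the study draws.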

    A simple two-state model interprets temporal modulations in eruptive activity and enhances multivolcano hazard quantification

    Volcanic activity typically switches between high-activity states with many eruptions and low-activity states with few or no eruptions. We present a simple two-regime, physics-informed statistical model for interpreting temporal modulations in eruptive activity. The model enhances comprehension and comparison of different volcanic systems and enables homogeneous integration into multivolcano hazard assessments that account for potential changes in volcanic regimes. The model satisfactorily fits the eruptive history of the three active volcanoes in the Neapolitan area, Italy (Mt. Vesuvius, Campi Flegrei, and Ischia), which encompass a wide range of volcanic behaviors. We find that these volcanoes have appreciably different processes for triggering and ending high-activity periods, connected to different dominant volcanic processes controlling their eruptive activity, with different characteristic times and activity rates (expressed as number of eruptions per time interval). Presently, all three volcanoes are judged to be in a low-activity state, with eruption probabilities decreasing in the order Mt. Vesuvius, Ischia, and Campi Flegrei.
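    A two-regime model of this kind can be pictured as a Markov-modulated Poisson process: the system alternates between high- and low-activity states with random dwell times, and eruptions within each state occur at that state's rate. The sketch below is a generic simulation of that idea, not the authors' fitted model; all parameter values are illustrative:

    ```python
    import random

    def simulate_two_state(t_end, rate_high, rate_low,
                           mean_dwell_high, mean_dwell_low, seed=1):
        """Simulate eruption times for a two-regime process on [0, t_end].
        Dwell times in each regime are exponential; eruptions within a regime
        follow a Poisson process with that regime's rate (eruptions / unit time).
        """
        rng = random.Random(seed)
        t, state, eruptions = 0.0, "low", []
        while t < t_end:
            mean_dwell = mean_dwell_high if state == "high" else mean_dwell_low
            rate = rate_high if state == "high" else rate_low
            dwell = rng.expovariate(1.0 / mean_dwell)
            # Draw eruption times inside this regime interval
            s = t
            while True:
                s += rng.expovariate(rate) if rate > 0 else float("inf")
                if s >= min(t + dwell, t_end):
                    break
                eruptions.append(s)
            t += dwell
            state = "low" if state == "high" else "high"
        return eruptions
    ```

    Fitting such a model to an eruptive history yields per-volcano dwell times and rates, which is what allows the homogeneous comparison across Vesuvius, Campi Flegrei, and Ischia described in the abstract.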

    Prospective CSEP evaluation of 1-Day, 3-month, and 5-Yr earthquake forecasts for Italy

    In 2009, the global Collaboratory for the Study of Earthquake Predictability (CSEP) launched three experiments to forecast the distribution of earthquakes in Italy in the subsequent 5 yrs. CSEP solicited forecasts for seismicity tomorrow, in the next three months, and for the entire 5 yrs. In those 5 yrs, the Istituto Nazionale di Geofisica e Vulcanologia (INGV) recorded 83 target earthquakes with local magnitude 3.95 ≀ ML < 4.95, and 14 larger shocks. The results show that 1-day forecasts are consistent with the number and magnitudes of the target earthquakes, and one version of the epidemic-type aftershock sequence (ETAS) model is also consistent with the spatial distribution; ensemble forecasts, which we created for the 1-day experiment, are consistent with the number, locations, and magnitudes of the target earthquakes, and they perform as well as the best model; none of the 3-month time-independent models produce consistent forecasts; the best 5-yr models account for the fault distribution and the historical seismicity; and 5-yr models based on instrumental seismicity and b-value spatial variation show poor forecasting performance. © 2018 Seismological Society of America. All rights reserved.

    Role of the repeat expansion size in predicting age of onset and severity in RFC1 disease

    RFC1 disease, caused by biallelic repeat expansion in RFC1, is clinically heterogeneous in terms of age of onset, disease progression and phenotype. We investigated the role of the repeat size in influencing clinical variables in RFC1 disease. We also assessed the presence and role of meiotic and somatic instability of the repeat. In this study, we identified 553 patients carrying biallelic RFC1 expansions and measured the repeat expansion size in 392 cases. Pearson's correlation coefficient was calculated to assess the correlation between the repeat size and age at disease onset. A Cox model with robust cluster standard errors was adopted to describe the effect of repeat size on age at disease onset, on age at onset of each individual symptom, and on disease progression. A quasi-Poisson regression model was used to analyse the relationship between phenotype and repeat size. We performed multivariate linear regression to assess the association of the repeat size with the degree of cerebellar atrophy. Meiotic stability was assessed by Southern blotting on first-degree relatives of 27 probands. Finally, somatic instability was investigated by optical genome mapping on cerebellar and frontal cortex and unaffected peripheral tissue from four post-mortem cases. A larger repeat size of both the smaller and the larger allele was associated with an earlier age at neurological onset (smaller allele HR=2.06, p<0.001; larger allele HR=1.53, p<0.001) and with a higher hazard of developing disabling symptoms, such as dysarthria or dysphagia (smaller allele HR=3.40, p<0.001; larger allele HR=1.71, p=0.002) or loss of independent walking (smaller allele HR=2.78, p<0.001; larger allele HR=1.60, p<0.001) earlier in the disease course. Patients with more complex phenotypes carried larger expansions (smaller allele: complex neuropathy RR=1.30, p=0.003; CANVAS RR=1.34, p<0.001; larger allele: complex neuropathy RR=1.33, p=0.008; CANVAS RR=1.31, p=0.009).
Furthermore, larger repeat expansions in the smaller allele were associated with more pronounced cerebellar vermis atrophy (lobules I-V ÎČ=-1.06, p<0.001; lobules VI-VII ÎČ=-0.34, p=0.005). The repeat did not show significant instability during vertical transmission and across different tissues and brain regions. RFC1 repeat size, particularly of the smaller allele, is one of the determinants of variability in RFC1 disease and represents a key prognostic factor to predict disease onset, phenotype, and severity. Assessing the repeat size is warranted as part of the diagnostic test for RFC1 expansion. Funding: This work was supported by the Medical Research Council (MR/T001712/1), Fondazione Cariplo (grant n. 2019-1836), the Inherited Neuropathy Consortium, and Fondazione Regionale per la Ricerca Biomedica (Regione Lombardia, project ID 1751723). R. CurrĂČ was supported by the European Academy of Neurology (EAN) Research Fellowship 2021. H. Houlden and M.M. Reilly thank the MRC, the Wellcome Trust, the MDA, MD UK, Ataxia UK, The MSA Trust, the Rosetrees Trust and the NIHR UCLH BRC for grant support. F. Taroni thanks the Fondazione Regionale per la Ricerca Biomedica (CP 20/2018, Care4NeuroRare) and the Italian Ministry of Health (RC) for grant support. D. Pareyson thanks the Italian Ministry of Health (RRC) for grant support. F.M. Santorelli thanks Ricerca Corrente 2022 Ministero della Salute 5X1000 for grant support. M. Synofzik thanks the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) and the European Joint Programme on Rare Diseases for grant support. P.F. Chinnery thanks the Medical Research Council Mitochondrial Biology Unit, the Medical Research Council (MRC) International Centre for Genomic Medicine in Neuromuscular Disease, the Leverhulme Trust (RPG-2018-408), the Medical Research Council, the Alzheimer's Society Project, and the NIHR Cambridge Biomedical Research Centre for grant support.
Acknowledgements: We thank the patients and relatives who participated in this study.
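    The first statistic reported in the methods, Pearson's correlation between repeat size and age at onset, is straightforward to compute. A self-contained sketch (the data in the example are toy values, not the study's measurements; the Cox and quasi-Poisson models would require a survival-analysis library):

    ```python
    import math

    def pearson_r(x, y):
        """Pearson correlation coefficient between two equal-length sequences."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    # Hypothetical illustration: larger repeats, earlier onset -> negative r
    repeat_sizes = [400, 600, 800, 1000]
    onset_ages = [55, 48, 42, 36]
    r = pearson_r(repeat_sizes, onset_ages)  # negative correlation
    ```

    A negative r here would be consistent with the study's finding that larger expansions associate with earlier neurological onset, although the paper's effect sizes come from the Cox model, not from the correlation alone.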

    Probabilistic tsunami forecasting for early warning

    Tsunami warning centres face the challenging task of rapidly forecasting tsunami threat immediately after an earthquake, when uncertainty is high owing to data deficiency. Here we introduce Probabilistic Tsunami Forecasting (PTF) for tsunami early warning. PTF explicitly treats data and forecast uncertainties, enabling alert-level definitions according to any predefined level of conservatism, which is connected to the average balance of missed versus false alarms. Impact forecasts and the resulting recommendations become progressively less uncertain as new data become available. Here we report an implementation for near-source early warning and test it systematically by hindcasting the great 2010 M8.8 Maule (Chile) and the well-studied 2003 M6.8 Zemmouri-Boumerdes (Algeria) tsunamis, as well as all the Mediterranean earthquakes that triggered alert messages at the Italian Tsunami Warning Centre since its inception in 2015, demonstrating forecasting accuracy over a wide range of magnitudes and earthquake types.
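    The idea of setting alert levels "according to any predefined level of conservatism" can be sketched as reading a chosen percentile of a weighted ensemble of forecast wave heights and mapping it to discrete alert levels. This is only a conceptual illustration; the thresholds, level names, and percentile below are hypothetical, not the PTF's operational values:

    ```python
    def alert_level(heights, weights, percentile=0.85,
                    thresholds=((0.2, "advisory"), (0.5, "watch"))):
        """Pick an alert level from a weighted ensemble of forecast coastal
        wave heights (metres). A higher percentile is more conservative:
        it tolerates more false alarms to reduce missed alarms."""
        total = sum(weights)
        pairs = sorted(zip(heights, weights))
        cum, h_p = 0.0, pairs[-1][0]
        for h, w in pairs:  # walk the weighted empirical distribution
            cum += w / total
            if cum >= percentile:
                h_p = h
                break
        level = "information"
        for thr, name in thresholds:  # highest threshold reached wins
            if h_p >= thr:
                level = name
        return level, h_p

    # Four ensemble scenarios with their weights
    level, h = alert_level([0.1, 0.3, 0.6, 1.0], [0.4, 0.3, 0.2, 0.1])
    ```

    As new data (e.g. a refined focal mechanism) arrive, the ensemble tightens, the chosen percentile moves less, and the alert becomes less uncertain, which is the behaviour the abstract describes.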

    The SPTLC1 p.S331 mutation bridges sensory neuropathy and motor neuron disease and has implications for treatment

    Aims: SPTLC1-related disorder is a late-onset sensory-autonomic neuropathy associated with perturbed sphingolipid homeostasis, which can be improved by supplementation with the serine palmitoyl-CoA transferase (SPT) substrate, l-serine. Recently, a juvenile form of motor neuron disease has been linked to SPTLC1 variants. Variants affecting the p.S331 residue of SPTLC1 cause a distinct phenotype whose pathogenic basis has not been established. This study aims to define the neuropathological and biochemical consequences of the SPTLC1 p.S331 variant and to test the response to l-serine in this specific genotype.
    Methods: We report the clinical and neurophysiological characterisation of two unrelated children carrying distinct p.S331 SPTLC1 variants. The neuropathology was investigated by analysis of sural nerve and skin innervation. To clarify the biochemical consequences of the p.S331 variant, we performed sphingolipidomic profiling of serum and skin fibroblasts. We also tested the effect of l-serine supplementation in skin fibroblasts of patients with p.S331 mutations.
    Results: In both patients we recognised an early-onset phenotype with prevalent progressive motor neuron disease. Neuropathology showed severe damage to the sensory and autonomic systems. Sphingolipidomic analysis showed the coexistence of neurotoxic deoxy-sphingolipids with an excess of canonical products of the SPT enzyme. l-serine supplementation in patient fibroblasts reduced the production of toxic 1-deoxysphingolipids but further increased the overproduction of sphingolipids.
    Conclusions: Our findings suggest that p.S331 SPTLC1 variants lead to an overlap phenotype combining features of sensory and motor neuropathies, proposing a continuum in the spectrum of SPTLC1-related disorders. l-serine supplementation in these patients may be detrimental.

    The making of the NEAM Tsunami Hazard Model 2018 (NEAMTHM18)

    The NEAM Tsunami Hazard Model 2018 (NEAMTHM18) is a probabilistic hazard model for tsunamis generated by earthquakes. It covers the coastlines of the North-eastern Atlantic, the Mediterranean, and connected seas (NEAM). NEAMTHM18 was designed as a three-phase project. The first two phases were dedicated to model development and hazard calculations, following a formalized decision-making process based on a multiple-expert protocol. The third phase was dedicated to documentation and dissemination. The hazard assessment workflow was structured in Steps and Levels. There are four Steps: Step-1) probabilistic earthquake model; Step-2) tsunami generation and modeling in deep water; Step-3) shoaling and inundation; Step-4) hazard aggregation and uncertainty quantification. Each Step includes a different number of Levels. Level-0 always describes the input data; the other Levels describe the intermediate results needed to proceed from one Step to another. Alternative datasets and models were considered in the implementation. The epistemic hazard uncertainty was quantified through an ensemble modeling technique accounting for alternative models' weights and yielding a distribution of hazard curves represented by the mean and various percentiles. Hazard curves were calculated at 2,343 Points of Interest (POI) distributed at an average spacing of ∌20 km. Precalculated probability maps for five maximum inundation heights (MIH) and hazard intensity maps for five average return periods (ARP) were produced from the hazard curves. In the entire NEAM Region, MIHs of several meters are rare but not impossible. Considering a 2% probability of exceedance in 50 years (ARP≈2,475 years), the POIs with MIH >5 m are fewer than 1% and are all in the Mediterranean, on the coasts of Libya, Egypt, Cyprus, and Greece. In the North-East Atlantic, POIs with MIH >3 m are on the coasts of Mauritania and the Gulf of Cadiz. Overall, 30% of the POIs have MIH >1 m.
NEAMTHM18 results and documentation are available through the TSUMAPS-NEAM project website (http://www.tsumaps-neam.eu/), featuring an interactive web mapper. Although the NEAMTHM18 cannot substitute for in-depth analyses at local scales, it represents a first action toward local and more detailed hazard and risk assessments and contributes to designing evacuation maps for tsunami early warning.
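    The correspondence quoted above between a 2% probability of exceedance in 50 years and an ARP of about 2,475 years follows from the stationary-Poisson relation ARP = −T / ln(1 − p). A minimal check (function name is mine):

    ```python
    import math

    def average_return_period(poe, t_years):
        """Average return period from a probability of exceedance `poe`
        over an exposure time `t_years`, assuming Poissonian exceedances:
            poe = 1 - exp(-t_years / ARP)  =>  ARP = -t_years / ln(1 - poe)
        """
        return -t_years / math.log(1.0 - poe)

    arp = average_return_period(0.02, 50.0)  # ~2475 yr, as in the abstract
    ```

    The same relation, applied pointwise to each POI's hazard curve, is how probability maps at fixed exposure times and intensity maps at fixed ARPs can be derived from one another.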
    • 

    corecore