
    Setting up an earthquake forecast experiment in Italy

    We describe the setting up of the first earthquake forecasting experiment for Italy within the Collaboratory for the Study of Earthquake Predictability (CSEP). CSEP conducts rigorous, truly prospective forecast experiments for different tectonic environments at several forecast testing centers around the globe; forecasts are issued for a future period and tested only against future observations to avoid any possible bias. As such, experiments need to be completely defined. This includes exact definitions of the testing area, of the learning data for the forecast models, and of the observation data against which forecasts will be tested to evaluate their performance. Here we present the rules, as taken from the Regional Earthquake Likelihood Models experiment and extended or changed for the Italian experiment. We also present characterizations of the learning and observational catalogs that describe their completeness and illuminate magnitude inhomogeneities between them. A particular focus lies on the stability of earthquake recordings of the observational network. These catalog investigations provide guidance to CSEP modelers developing earthquake forecasts for submission to the Italian experiment.
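
    The catalog characterizations described above hinge on estimating the magnitude of completeness (Mc) of the learning and observational catalogs. As a hedged illustration (not the authors' procedure), the sketch below implements the widely used maximum-curvature estimate of Mc; the synthetic Gutenberg-Richter catalog and the empirical +0.2 correction are assumptions of the example.

        import numpy as np

        def mc_max_curvature(mags, bin_width=0.1, correction=0.2):
            """Estimate the magnitude of completeness (Mc) as the most
            populated magnitude bin (maximum curvature), plus an
            empirical correction often added to this estimator."""
            edges = np.arange(mags.min(), mags.max() + bin_width, bin_width)
            counts, _ = np.histogram(mags, bins=edges)
            mc = edges[np.argmax(counts)] + bin_width / 2.0
            return mc + correction

        # Hypothetical catalog: Gutenberg-Richter magnitudes (b = 1)
        # complete above M 2.5; real input would be a CSEP catalog.
        rng = np.random.default_rng(0)
        mags = rng.exponential(scale=1 / np.log(10), size=20000) + 2.5
        print(f"Estimated Mc: {mc_max_curvature(mags):.2f}")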

    Simultaneous Dependence of the Earthquake-Size Distribution on Faulting Style and Depth

    We analyze two high-quality Southern California earthquake catalogues, one with focal mechanisms, to statistically model and test for dependencies of the earthquake-size distribution, the b value, on both faulting style and depth. In our null hypothesis, b is assumed constant. We then develop and calibrate one model based only on faulting style, another based only on depth dependence, and two models that assume a simultaneous dependence on both parameters. We develop a new maximum-likelihood estimator, corrected for the degrees of freedom, to assess the models' performance. Our results show that all models significantly reject the null hypothesis. The best-performing model is the one that simultaneously accounts for depth and faulting style. Our results suggest that differential stress variations in the Earth's crust systematically influence b values and that this variability should be considered in contemporary seismic hazard studies.
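
    For context, the standard point estimate underlying b-value studies of this kind is the Aki-Utsu maximum-likelihood estimator; the paper's degrees-of-freedom-corrected estimator is a refinement of this idea. The sketch below shows only the textbook form, with the Shi-Bolt standard error.

        import numpy as np

        def b_value_mle(mags, mc, dm=0.1):
            """Aki-Utsu maximum-likelihood b value for magnitudes binned
            at dm and complete above mc, with the Shi & Bolt (1982)
            standard error."""
            m = np.asarray(mags)
            m = m[m >= mc]
            b = np.log10(np.e) / (m.mean() - (mc - dm / 2.0))
            se = 2.3 * b**2 * m.std(ddof=1) / np.sqrt(len(m))
            return b, se

        # Usage (hypothetical): b, se = b_value_mle(catalog_mags, mc=2.5)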

    Crustal structure below Popocatépetl Volcano (Mexico) from analysis of Rayleigh waves

    An array of ten broadband stations was installed on Popocatépetl volcano (Mexico) for five months, between October 2002 and February 2003. Twenty-six regional and teleseismic earthquakes were selected and filtered in the frequency-time domain to extract the fundamental mode of the Rayleigh wave. The average dispersion curve was obtained in two steps. First, phase velocities were measured in the 2-50 s period range from the phase difference between pairs of stations, using Wiener filtering. Second, the average dispersion curve was calculated by combining observations from all events in order to reduce diffraction effects. The inversion of the mean phase velocity yielded a crustal model for the volcano that is consistent with previous models of the Mexican Volcanic Belt. The overall crustal structure beneath Popocatépetl is therefore not different from that of the surrounding area, and the velocities in the lower crust are confirmed to be relatively low. Lateral variations of the structure were also investigated by dividing the network into four parts and applying the same procedure to each sub-array. No well-defined anomalies appeared for the two sub-arrays for which a dispersion curve could be measured. However, dispersion curves associated with individual events reveal strong diffraction at periods of 6-12 s, which could correspond to strong lateral variations at 5-10 km depth.
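
    A minimal sketch of the two-station phase-velocity measurement described above, assuming plane-wave propagation along the inter-station path; the Wiener filtering, fundamental-mode extraction, and event averaging of the actual study are omitted, and the 2*pi cycle ambiguity is assumed to be resolved from a starting model.

        import numpy as np

        def two_station_phase_velocity(u1, u2, dt, dist_km, n_cycles=0):
            """Phase velocity (km/s) versus frequency from the
            cross-spectrum phase of two equally long records u1, u2
            separated by dist_km along the propagation path."""
            spec = np.fft.rfft(u1) * np.conj(np.fft.rfft(u2))
            freqs = np.fft.rfftfreq(len(u1), d=dt)
            phase = np.unwrap(np.angle(spec))
            # c(f) = 2*pi*f*dist / (phase + 2*pi*n); f = 0 yields nan
            with np.errstate(divide="ignore", invalid="ignore"):
                c = 2 * np.pi * freqs * dist_km / (phase + 2 * np.pi * n_cycles)
            return freqs, c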

    Testing machine learning models for heuristic building damage assessment applied to the Italian Database of Observed Damage (DaDO)

    Assessing or forecasting seismic damage to buildings is an essential issue for earthquake disaster management. In this study, we explore the efficacy of several machine learning models for damage characterization, trained and tested on the database of damage observed after Italian earthquakes (the Database of Observed Damage, DaDO). Six models were considered: regression- and classification-based machine learning models, each using random forest, gradient boosting, and extreme gradient boosting. The structural features considered were divided into two groups: all structural features provided by DaDO, or only those considered the most reliable and easiest to collect (age, number of storeys, floor area, building height). Macroseismic intensity was also included as an input feature. The seismic damage per building was determined according to the EMS-98 scale observed after seven significant earthquakes in several Italian regions. The results showed that extreme gradient boosting classification is statistically the most efficient method, particularly when considering the basic structural features and grouping the damage according to the traffic-light system used during the post-disaster period (green, yellow, and red): 68 % of buildings were correctly classified. The results obtained by the machine-learning-based heuristic model for damage assessment are of the same order of accuracy (error values less than 17 %) as those obtained by the traditional RISK-UE method. Finally, the machine learning analysis found that the importance of structural features with respect to damage depended on the level of damage considered.
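
    A hedged sketch of the classification setup described above, using scikit-learn's gradient boosting on synthetic stand-in data: the feature set mirrors the paper's basic features plus intensity, but the random data, grouping thresholds, and default model settings are assumptions for illustration, not the study's pipeline.

        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.metrics import accuracy_score
        from sklearn.model_selection import train_test_split

        # Synthetic stand-in for DaDO records: basic structural
        # features plus macroseismic intensity, and EMS-98 grades 0-5.
        rng = np.random.default_rng(1)
        n = 5000
        X = np.column_stack([
            rng.integers(1900, 2010, n),  # construction year (age proxy)
            rng.integers(1, 9, n),        # number of storeys
            rng.uniform(40, 400, n),      # floor area (m^2)
            rng.uniform(3, 30, n),        # building height (m)
            rng.uniform(5, 10, n),        # macroseismic intensity
        ])
        damage = rng.integers(0, 6, n)    # EMS-98 damage grade D0-D5

        # Traffic-light grouping: D0-D1 green, D2-D3 yellow, D4-D5 red.
        y = np.digitize(damage, bins=[2, 4])

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = GradientBoostingClassifier().fit(X_tr, y_tr)
        print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
        print("feature importances:", clf.feature_importances_)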

    Evaluation of a Decade-Long Prospective Earthquake Forecasting Experiment in Italy

    Earthquake forecasting models represent our current understanding of the physics and statistics that govern earthquake occurrence processes. Providing such forecasts as falsifiable statements can help us assess a model’s hypothesis to be, at the least, a plausible conjecture to explain the observations. Prospective testing (i.e., with future data, once the model and experiment have been fully specified) is fundamental in science because it enables confronting a model with completely out‐of‐sample data and zero degrees of freedom. Testing can also help inform decisions regarding the selection of models, data types, or procedures in practical applications, such as Probabilistic Seismic Hazard Analysis. In 2010, a 10‐year earthquake forecasting experiment began in Italy, where researchers collectively agreed on authoritative data sources, testing rules, and formats to independently evaluate a collection of forecasting models. Here, we test these models with ten years of fully prospective data using a multiscore approach to (1) identify the model features that correlate with data‐consistent or ‐inconsistent forecasts; (2) evaluate the stability of the experiment results over time; and (3) quantify the models’ limitations to generate spatial forecasts consistent with earthquake clustering. As each testing metric analyzes only limited properties of a forecast, the proposed synoptic analysis using multiple scores allows drawing more robust conclusions. Our results show that the best‐performing models use catalogs that span over 100 yr and incorporate fault information, demonstrating and quantifying the value of these data types. Model rankings are stable over time, suggesting that a 10‐year period in Italy can provide sufficient data to discriminate between optimal and suboptimal forecasts. Finally, no model can adequately describe spatial clustering, but those including fault information are less inconsistent with the observations. Prospective testing assesses relevant assumptions and hypotheses of earthquake processes truly out‐of‐sample, thus guiding model development and decision‐making to improve society’s earthquake resilience.
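
    One of the consistency metrics typically combined in such multiscore CSEP evaluations is the Poisson N-test, sketched below; the forecast rate and observed count in the example are illustrative, not results from the Italian experiment.

        from scipy.stats import poisson

        def n_test(forecast_rate, n_observed):
            """CSEP-style Poisson N-test: probability of at least
            (delta1) and at most (delta2) the observed number of
            events under the forecast; a very small value of either
            flags an inconsistent total rate."""
            delta1 = 1.0 - poisson.cdf(n_observed - 1, forecast_rate)
            delta2 = poisson.cdf(n_observed, forecast_rate)
            return delta1, delta2

        # Illustrative only: a model forecasting 8.2 target events
        # over the testing period, with 13 events observed.
        print(n_test(8.2, 13))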

    Universal Quake Statistics: From Compressed Nanocrystals to Earthquakes

    Slowly compressed single crystals, bulk metallic glasses (BMGs), rocks, granular materials, and the Earth all deform via intermittent slips or “quakes”. We find that although these systems span 12 decades in length scale, they all show the same scaling behavior for their slip-size distributions and other statistical properties. Remarkably, the size distributions follow the same power law multiplied by the same exponential cutoff. The cutoff grows with applied force for materials spanning length scales from nanometers to kilometers. The tunability of the cutoff with stress reflects “tuned critical” behavior, rather than self-organized criticality (SOC), which would imply stress independence. A simple mean-field model for avalanches of slipping weak spots explains the agreement across scales. It predicts the observed slip-size distributions and the observed stress-dependent cutoff function. The results enable extrapolations from one scale to another, and from one force to another, across different materials and structures, from nanocrystals to earthquakes.
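
    The functional form at the heart of this result, p(s) proportional to s^(-tau) * exp(-s/s*), can be fit by maximum likelihood. The sketch below is a minimal version: the numerical normalization, Nelder-Mead optimization, and heavy-tailed synthetic data are assumptions of the example, not the mean-field analysis of the paper.

        import numpy as np
        from scipy.integrate import quad
        from scipy.optimize import minimize

        def neg_log_likelihood(params, s, s_min):
            """Negative log-likelihood of a power law with exponential
            cutoff, p(s) ~ s**(-tau) * exp(-s / s_star), for s >= s_min."""
            tau, s_star = params
            if tau <= 0 or s_star <= 0:
                return np.inf
            norm, _ = quad(lambda x: x**-tau * np.exp(-x / s_star),
                           s_min, np.inf)
            return -np.sum(-tau * np.log(s) - s / s_star - np.log(norm))

        # Synthetic heavy-tailed stand-in for measured slip sizes.
        rng = np.random.default_rng(2)
        s = rng.pareto(1.5, 2000) + 1.0

        fit = minimize(neg_log_likelihood, x0=[1.5, 50.0], args=(s, 1.0),
                       method="Nelder-Mead")
        tau_hat, s_star_hat = fit.x
        print(f"tau = {tau_hat:.2f}, cutoff s* = {s_star_hat:.1f}")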

    Prototype ATLAS IBL Modules using the FE-I4A Front-End Readout Chip

    The ATLAS Collaboration will upgrade its semiconductor pixel tracking detector with a new Insertable B-layer (IBL) between the existing pixel detector and the vacuum pipe of the Large Hadron Collider. The extreme operating conditions at this location have necessitated the development of new radiation-hard pixel sensor technologies and a new front-end readout chip, called the FE-I4. Planar pixel sensors and 3D pixel sensors have been investigated to equip this new pixel layer, and prototype modules using the FE-I4A have been fabricated and characterized using 120 GeV pions at the CERN SPS and 4 GeV positrons at DESY, before and after module irradiation. Beam test results are presented, including charge collection efficiency, tracking efficiency, and charge sharing.
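
    As a side note on how beam-test efficiencies such as these are commonly quoted, the sketch below computes a hit efficiency with a Clopper-Pearson binomial interval; the counts are invented for illustration, and this is standard practice rather than code or numbers from the paper.

        from scipy.stats import beta

        def efficiency_ci(n_pass, n_total, cl=0.683):
            """Efficiency estimate with a Clopper-Pearson binomial
            confidence interval at confidence level cl."""
            eff = n_pass / n_total
            lo = beta.ppf((1 - cl) / 2, n_pass, n_total - n_pass + 1) \
                if n_pass > 0 else 0.0
            hi = beta.ppf(1 - (1 - cl) / 2, n_pass + 1, n_total - n_pass) \
                if n_pass < n_total else 1.0
            return eff, lo, hi

        # Illustrative only: 9875 tracks with a matched hit out of 10000.
        print(efficiency_ci(9875, 10000))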

    Search for squarks and gluinos in events with isolated leptons, jets and missing transverse momentum at √s = 8 TeV with the ATLAS detector

    The results of a search for supersymmetry in final states containing at least one isolated lepton (electron or muon), jets and large missing transverse momentum with the ATLAS detector at the Large Hadron Collider are reported. The search is based on proton-proton collision data at a centre-of-mass energy √s = 8 TeV collected in 2012, corresponding to an integrated luminosity of 20 fb⁻¹. No significant excess above the Standard Model expectation is observed. Limits are set on supersymmetric particle masses for various supersymmetric models. Depending on the model, the search excludes gluino masses up to 1.32 TeV and squark masses up to 840 GeV. Limits are also set on the parameters of a minimal universal extra dimension model, excluding a compactification radius of 1/Rc = 950 GeV for a cut-off scale times radius (ΛRc) of approximately 30.
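
    Exclusion limits of this kind rest on statistical procedures such as CLs. A toy single-bin counting-experiment version is sketched below; it ignores systematic uncertainties and uses invented numbers, whereas the actual search employs a much fuller statistical treatment.

        from scipy.stats import poisson

        def cls_upper_limit(n_obs, background, cl=0.95, step=0.01):
            """Upper limit on the signal yield s at confidence level cl
            in a single counting experiment, using the CLs prescription:
            CLs = P(N <= n_obs | s + b) / P(N <= n_obs | b)."""
            clb = poisson.cdf(n_obs, background)
            s = 0.0
            while poisson.cdf(n_obs, s + background) / clb >= 1.0 - cl:
                s += step
            return s

        # Illustrative only: 5 events observed over an expected
        # background of 4.2 excludes signal yields above about 7.
        print(cls_upper_limit(5, 4.2))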