
    Retrospective Evaluation of the Five-Year and Ten-Year CSEP-Italy Earthquake Forecasts

    On 1 August 2009, the global Collaboratory for the Study of Earthquake Predictability (CSEP) launched a prospective and comparative earthquake predictability experiment in Italy. The goal of the CSEP-Italy experiment is to test earthquake occurrence hypotheses that have been formalized as probabilistic earthquake forecasts over temporal scales that range from days to years. In the first round of forecast submissions, members of the CSEP-Italy Working Group presented eighteen five-year and ten-year earthquake forecasts to the European CSEP Testing Center at ETH Zurich. We considered the twelve time-independent earthquake forecasts among this set and evaluated them with respect to past seismicity data from two Italian earthquake catalogs. In this article, we present the results of tests that measure the consistency of the forecasts with the past observations. Besides being an evaluation of the submitted time-independent forecasts, this exercise provided insight into a number of important issues in predictability experiments with regard to the specification of the forecasts, the performance of the tests, and the trade-off between the robustness of results and experiment duration. We conclude with suggestions for the future design of earthquake predictability experiments. Comment: 43 pages, 8 figures, 4 tables
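As a sketch of the kind of consistency test used in such evaluations, the CSEP number test (N-test) compares the observed earthquake count with the forecast's total rate under a Poisson assumption. The function names and example numbers below are illustrative, not taken from the experiment:

```python
import math

def poisson_cdf(n, lam):
    """P(X <= n) for a Poisson variable with mean lam."""
    return sum(math.exp(-lam) * lam ** k / math.factorial(k) for k in range(n + 1))

def n_test(forecast_rate, observed_count):
    """CSEP-style N-test quantiles.

    delta1 = P(X >= n_obs): small values flag under-prediction.
    delta2 = P(X <= n_obs): small values flag over-prediction.
    """
    delta1 = 1.0 - poisson_cdf(observed_count - 1, forecast_rate)
    delta2 = poisson_cdf(observed_count, forecast_rate)
    return delta1, delta2

# Example: a forecast of 10 events with 14 observed; neither quantile is
# small, so the forecast is not rejected at the usual significance levels.
d1, d2 = n_test(10.0, 14)
```

A forecast is deemed inconsistent with the observations only when one of the quantiles falls below the chosen significance threshold.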

    Adaptively Smoothed Seismicity Earthquake Forecasts for Italy

    We present a model for estimating the probabilities of future earthquakes of magnitudes m > 4.95 in Italy. The model, a slightly modified version of the one proposed for California by Helmstetter et al. (2007) and Werner et al. (2010), approximates seismicity by a spatially heterogeneous, temporally homogeneous Poisson point process. The temporal, spatial and magnitude dimensions are entirely decoupled. Magnitudes are independently and identically distributed according to a tapered Gutenberg-Richter magnitude distribution. We estimated the spatial distribution of future seismicity by smoothing the locations of past earthquakes listed in two Italian catalogs: a short instrumental catalog and a longer instrumental and historical catalog. The bandwidth of the adaptive spatial kernel is estimated by optimizing the predictive power of the kernel estimate of the spatial earthquake density in retrospective forecasts. When available and trustworthy, we used small earthquakes m > 2.95 to illuminate active fault structures and likely future epicenters. By calibrating the model on two catalogs of different duration to create two forecasts, we intend to quantify the loss (or gain) of predictability incurred when only a short but recent data record is available. Both forecasts, scaled to five and ten years, were submitted to the Italian prospective forecasting experiment of the global Collaboratory for the Study of Earthquake Predictability (CSEP). An earlier forecast from the model was submitted by Helmstetter et al. (2007) to the Regional Earthquake Likelihood Model (RELM) experiment in California and, with more than half of the five-year experiment elapsed, that forecast performs better than its competitors. Comment: revised manuscript. 22 pages, 3 figures, 2 tables
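The tapered Gutenberg-Richter distribution mentioned above is most naturally written as a survival function in seismic moment. A minimal sketch, assuming the Hanks-Kanamori moment-magnitude relation and an illustrative corner magnitude of 8.0 (not the values fitted in the paper):

```python
import math

def tapered_gr_survival(m, m_min=4.95, b=1.0, m_corner=8.0):
    """Survival function P(Mag > m) of a tapered Gutenberg-Richter law,
    expressed in seismic moment (beta = 2*b/3; the corner moment tapers
    the tail exponentially)."""
    beta = 2.0 * b / 3.0
    moment = lambda mag: 10.0 ** (1.5 * mag + 9.05)  # Hanks-Kanamori, N*m
    m0, mt, mc = moment(m), moment(m_min), moment(m_corner)
    return (mt / m0) ** beta * math.exp((mt - m0) / mc)
```

Below the corner magnitude the survival function tracks the plain power law 10^(-b (m - m_min)); above it, the exponential term cuts the tail off.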

    Magnetically Stabilized Nematic Order I: Three-Dimensional Bipartite Optical Lattices

    We study magnetically stabilized nematic order for spin-one bosons in optical lattices. We show that the Zeeman field-driven quantum transitions between non-nematic Mott states and quantum spin nematic states in the weak hopping limit are in the universality class of the ferromagnetic XXZ (S=1/2) spin model. We further discuss these transitions as condensation of interacting magnons. The development of O(2) nematic order when external fields are applied corresponds to condensation of magnons, which breaks a U(1) symmetry. Microscopically, this results from a coherent superposition of two non-nematic states at each individual site. Nematic order and spin wave excitations around critical points are studied and critical behaviors are obtained in a dilute gas approximation. We also find that spin singlet states are unstable with respect to quadratic Zeeman effects and Ising nematic order appears in the presence of any finite quadratic Zeeman coupling. All discussions are carried out for states in three-dimensional bipartite lattices. Comment: 16 pages, 3 figures

    A prospective earthquake forecast experiment in the western Pacific

    Since the beginning of 2009, the Collaboratory for the Study of Earthquake Predictability (CSEP) has been conducting an earthquake forecast experiment in the western Pacific. This experiment is an extension of the Kagan-Jackson experiments begun 15 years earlier and is a prototype for future global earthquake predictability experiments. At the beginning of each year, seismicity models make a spatially gridded forecast of the number of Mw ≥ 5.8 earthquakes expected in the next year. For the three participating statistical models, we analyse the first two years of this experiment. We use likelihood-based metrics to evaluate the consistency of the forecasts with the observed target earthquakes, and we apply measures based on Student's t-test and the Wilcoxon signed-rank test to compare the forecasts. Overall, a simple smoothed seismicity model (TripleS) performs the best, but there are some exceptions that indicate continued experiments are vital to fully understand the stability of these models, the robustness of model selection and, more generally, earthquake predictability in this region. We also estimate uncertainties in our results that are caused by uncertainties in earthquake location and seismic moment. Our uncertainty estimates are relatively small and suggest that the evaluation metrics are relatively robust. Finally, we consider the implications of our results for a global earthquake forecast experiment.
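The likelihood-based metrics referred to here treat each grid cell as an independent Poisson variable; two forecasts are then compared through their per-earthquake information gain. A minimal sketch (the function names are illustrative, not from the experiment software):

```python
import math

def poisson_log_likelihood(forecast, observed):
    """Joint log-likelihood of gridded counts under independent Poisson cells
    whose means are the forecast rates (the basis of CSEP likelihood tests)."""
    return sum(-lam + n * math.log(lam) - math.lgamma(n + 1)
               for lam, n in zip(forecast, observed))

def information_gain_per_eq(ll_a, ll_b, n_events):
    """Per-earthquake information gain of model A over model B; samples of
    this quantity are what a paired t-test or Wilcoxon test compares."""
    return (ll_a - ll_b) / n_events
```

A forecast that concentrates rate where the target earthquakes actually occur scores a higher joint log-likelihood and hence a positive information gain over its competitor.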

    The 1997 Kagoshima (Japan) earthquake doublet: A quantitative analysis of aftershock rate changes

    We quantitatively map relative rate changes for the aftershock sequence following the second mainshock of the 1997 earthquake doublet (M_W = 6.1, M_W = 6.0) in the Kagoshima province (Japan). Using the spatial distribution of the modified Omori law parameters for aftershocks that occurred during the 47.8 days between the two mainshocks, we forecast the aftershock activity in the next 50 days and compare it to the actually observed rates. The relative rate change map reveals four regions with statistically significant relative rate changes - three negative and one positive. Our analysis suggests that the coseismic rate changes for off-fault aftershocks could be explained by changes in static stress. However, to explain the activation and deactivation of on-fault seismicity, other mechanisms such as unusual crustal properties and the presence of abundant crustal fluids are required.
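Forecasting aftershock activity from modified Omori law parameters, as done above, amounts to integrating the rate n(t) = K/(t + c)^p over the forecast window. A minimal sketch with illustrative parameter values (not the Kagoshima estimates):

```python
import math

def omori_expected_count(K, c, p, t1, t2):
    """Expected aftershock count between t1 and t2 days after the mainshock,
    obtained by integrating the modified Omori rate n(t) = K / (t + c)**p."""
    if abs(p - 1.0) < 1e-9:
        # The p = 1 case integrates to a logarithm.
        return K * (math.log(t2 + c) - math.log(t1 + c))
    return K / (1.0 - p) * ((t2 + c) ** (1.0 - p) - (t1 + c) ** (1.0 - p))

# Illustrative forecast for days 47.8-97.8, i.e. the 50 days after the
# fitting window used in the paper.
n_forecast = omori_expected_count(K=100.0, c=0.05, p=1.1, t1=47.8, t2=97.8)
```

Comparing such expected counts with the observed ones, cell by cell, yields the relative rate change map.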

    Statistical analysis of the induced Basel 2006 earthquake sequence: introducing a probability-based monitoring approach for Enhanced Geothermal Systems

    Geothermal energy is becoming an important clean energy source; however, the stimulation of a reservoir for an Enhanced Geothermal System (EGS) is associated with seismic risk due to induced seismicity. Seismicity induced by water injection at depth has to be well recorded and monitored. To mitigate the seismic risk of a damaging event, an appropriate alarm system needs to be in place for each individual experiment. In recent experiments, the so-called traffic-light alarm system, based on public response, local magnitude and peak ground velocity, was used. We aim to improve the pre-defined alarm system by introducing a probability-based approach; we retrospectively model the ongoing seismicity in real time with multiple statistical forecast models and then translate the forecasts into seismic hazard in terms of probabilities of exceeding a ground motion intensity level. One class of models accounts for the water injection rate, the main parameter that the operators can control during an experiment. By translating the models into time-varying probabilities of exceeding various intensity levels, we provide tools that are well understood by decision makers and can be used to determine non-exceedance thresholds during a reservoir stimulation; this, however, remains an entrepreneurial or political decision of the responsible project coordinators. We introduce forecast models based on the data set of an EGS experiment in the city of Basel. Between 2006 December 2 and 8, approximately 11 500 m3 of water was injected into a 5-km-deep well at high pressures. A six-sensor borehole array was installed by the company Geothermal Explorers Limited (GEL) at depths between 300 and 2700 m around the well to monitor the induced seismicity. The network recorded approximately 11 200 events during the injection phase, more than 3500 of which were located.
With the traffic-light system, actions were implemented after an ML 2.7 event: the water injection rate was reduced, and injection was stopped altogether after another ML 2.5 event. A few hours later, an earthquake with ML 3.4, felt within the city, occurred, which led to bleed-off of the well. A risk study was later issued with the outcome that the experiment could not be resumed. We analyse the statistical features of the sequence and show that, following the termination of water injection, the sequence is well modelled by the Omori-Utsu law. Based on this model, the sequence will take 31+29/−14 years to reach the background level. We introduce statistical models based on the Reasenberg and Jones and Epidemic Type Aftershock Sequence (ETAS) models, commonly used to model aftershock sequences. We compare and test different model setups to simulate the sequences, varying the number of fixed and free parameters. For one class of ETAS models, we account for the flow rate at the injection borehole. We test the models against the observed data with standard likelihood tests and find that the ETAS model accounting for the flow rate performs best. Such a model may in future serve as a valuable tool for designing probabilistic alarm systems for EGS experiments.
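The quoted sequence duration follows from asking when the Omori-Utsu rate K/(t + c)^p decays to the background level. A minimal sketch of that calculation (the parameter values in the example are illustrative, not the Basel estimates):

```python
def omori_utsu_duration(K, c, p, mu):
    """Time after shut-in at which the Omori-Utsu rate K / (t + c)**p decays
    to the background rate mu, i.e. the expected sequence duration.

    Solves K / (t + c)**p = mu for t."""
    return (K / mu) ** (1.0 / p) - mu * 0.0 - c

# Illustrative: a sequence with K = 10 events/day, c = 0, p = 1 decaying to a
# background of 0.1 events/day lasts 100 days.
duration = omori_utsu_duration(K=10.0, c=0.0, p=1.0, mu=0.1)
```

Because the duration scales as (K/mu)^(1/p), uncertainty in the fitted p value translates into the wide, asymmetric confidence bounds reported above.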

    Earthquake detection capability of the Swiss Seismic Network

    A reliable estimate of completeness magnitudes is vital for many seismicity- and hazard-related studies. Here we adopted and further developed the Probability-based Magnitude of Completeness (PMC) method. This method determines network detection completeness (MP) using only empirical data: earthquake catalogue, phase picks and station information. To evaluate the applicability to low- or moderate-seismicity regions, we performed a case study in Switzerland. The Swiss Seismic Network (SSN) at present records seismicity with one of the densest networks of broad-band sensors in Europe. Based on data from 1983 January 1 to 2008 March 31, we found strong spatio-temporal variability of network completeness: the highest value of MP in Switzerland at present is 2.5 in the far southwest, close to the national boundary, whereas MP is lower than 1.6 in high-seismicity areas. Thus, events of magnitude 2.5 and above can be detected everywhere in Switzerland. We evaluated the temporal evolution of MP for the last 20 yr, showing the successful improvement of the SSN. We next introduced the calculation of uncertainties to the probabilistic method using a bootstrap approach. The results show that the uncertainties in completeness magnitudes are generally less than 0.1 magnitude units, implying that the method generates stable estimates of completeness magnitudes. We explored two possible uses of PMC: (1) as a tool to estimate the number of missing earthquakes in moderate-seismicity regions and (2) as a network planning tool, simulating the installation of one or more virtual stations to assess the resulting completeness and identify appropriate locations for new stations. We compared our results with an existing completeness study based on detecting the point of deviation from a power law in the earthquake-size distribution. In general, the new approach provides higher estimates of the completeness magnitude than the traditional one.
We attribute this observation to the different sensitivity of the two approaches during periods when the event detectability of the seismic network is low. Our results allow us to move towards a full description of completeness as a function of space and time, which can be used for hazard-model development and forecast-model testing, and they provide an illustrative example of the applicability of the PMC method to regions with low to moderate seismicity.
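A core step of the PMC method is combining independent per-station detection probabilities into the probability that enough stations pick an event to locate it. A minimal sketch of that combination step; the minimum pick count of four is an assumption for illustration, not a value taken from the study:

```python
def prob_at_least_k(p_stations, k):
    """Probability that at least k stations detect an event, given independent
    per-station detection probabilities (a Poisson-binomial tail computed by
    dynamic programming over the number of detections)."""
    dp = [1.0] + [0.0] * len(p_stations)  # dp[j] = P(exactly j detections)
    for p in p_stations:
        for j in range(len(dp) - 1, 0, -1):
            dp[j] = dp[j] * (1.0 - p) + dp[j - 1] * p
        dp[0] *= (1.0 - p)
    return sum(dp[k:])

# Illustrative: four stations, each with a 50% chance of detecting an event,
# and four picks required to locate it.
p_locate = prob_at_least_k([0.5, 0.5, 0.5, 0.5], k=4)  # 0.5**4 = 0.0625
```

The completeness magnitude MP at a grid point is then the smallest magnitude for which this network-level probability exceeds a chosen threshold; adding a virtual station simply appends one more entry to the probability list, which is how the network-planning simulations work.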

    Size distribution of Parkfield's microearthquakes reflects changes in surface creep rate

    The nucleation area of the series of M6 events in Parkfield has been shown to be characterized by low b-values throughout the seismic cycle. Since low b-values represent high differential stresses, the asperity structure appears to be permanently and stably stressed, essentially unaffected even by the latest main shock in 2004. However, because fault loading rates and applied shear stress vary with time, some degree of temporal variability of the b-value within stable blocks is to be expected. In this study, we discuss suitable techniques and uncertainty treatment for a detailed analysis of the temporal evolution of b-values. We show that the derived signal for the Parkfield asperity correlates with changes in surface creep, suggesting a sensitive time resolution of the b-value stress meter and confirming near-critical loading conditions within the Parkfield asperity.
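b-values such as those discussed here are commonly estimated with the Aki-Utsu maximum-likelihood estimator, with the Shi and Bolt (1982) formula for the standard error. A minimal sketch of that standard approach, not the exact procedure of the paper:

```python
import math

def b_value_mle(mags, m_c, dm=0.0):
    """Aki-Utsu maximum-likelihood b-value for magnitudes >= m_c, with Utsu's
    correction for magnitude binning dm, plus the Shi & Bolt (1982) error."""
    above = [m for m in mags if m >= m_c]
    n = len(above)
    mean_m = sum(above) / n
    b = math.log10(math.e) / (mean_m - (m_c - dm / 2.0))
    # Shi & Bolt (1982) standard error of the b-value estimate.
    var = sum((m - mean_m) ** 2 for m in above) / (n * (n - 1))
    sigma = 2.30 * b * b * math.sqrt(var)
    return b, sigma
```

Tracking b and sigma in sliding time windows, and treating differences smaller than the combined uncertainties as insignificant, is the kind of uncertainty treatment the temporal analysis requires.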

    A smoothed stochastic earthquake rate model considering seismicity and fault moment release for Europe

    We present a time-independent gridded earthquake rate forecast for the European region including Turkey. The spatial component of our model is based on kernel density estimation techniques, which we applied to both past earthquake locations and fault moment release on mapped crustal faults and subduction zone interfaces with assigned slip rates. Our forecast relies on the assumption that the locations of past seismicity are a good guide to future seismicity, and that future large-magnitude events are more likely to occur in the vicinity of known faults. We show that the optimal weighted sum of the corresponding two spatial densities depends on the magnitude range considered. The kernel bandwidths and density weighting function are optimized using retrospective likelihood-based forecast experiments. We computed earthquake activity rates (a- and b-value) of the truncated Gutenberg-Richter distribution separately for crustal and subduction seismicity based on a maximum likelihood approach that considers the spatial and temporal completeness history of the catalogue. The final annual rate of our forecast is purely driven by the maximum likelihood fit of activity rates to the catalogue data, whereas its spatial component incorporates contributions from both earthquake and fault moment-rate densities. Our model constitutes one branch of the earthquake source model logic tree of the 2013 European seismic hazard model released by the EU-FP7 project ‘Seismic HAzard haRmonization in Europe' (SHARE) and contributes to the assessment of epistemic uncertainties in earthquake activity rates. We performed retrospective and pseudo-prospective likelihood consistency tests to underline the reliability of our model and of SHARE's area source model (ASM), using the testing algorithms applied in the Collaboratory for the Study of Earthquake Predictability (CSEP).
We comparatively tested our model's forecasting skill against the ASM and find a statistically significant better performance for testing periods of 10-20 yr. The testing results suggest that our model is a viable candidate for long-term forecasting on timescales of years to decades for the European region.
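The weighted sum of the two spatial densities described above can be sketched as follows; for simplicity the weight is treated here as a single scalar, whereas the paper optimizes a magnitude-dependent weighting function:

```python
def blend_spatial_densities(seis, faults, w):
    """Weighted sum of two gridded spatial densities, one from smoothed
    seismicity and one from fault moment release, renormalized so the
    blended density sums to one over the grid; w is the seismicity weight."""
    mixed = [w * s + (1.0 - w) * f for s, f in zip(seis, faults)]
    total = sum(mixed)
    return [x / total for x in mixed]
```

Multiplying the blended density by the total annual rate from the Gutenberg-Richter fit then yields the gridded rate forecast.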