
    The 1997 Kagoshima (Japan) earthquake doublet: A quantitative analysis of aftershock rate changes

    We quantitatively map relative rate changes for the aftershock sequence following the second mainshock of the 1997 earthquake doublet (M_W = 6.1, M_W = 6.0) in the Kagoshima province (Japan). Using the spatial distribution of the modified Omori law parameters for aftershocks that occurred during the 47.8 days between the two mainshocks, we forecast the aftershock activity for the next 50 days and compare it to the actually observed rates. The relative rate change map reveals four regions with statistically significant relative rate changes: three negative and one positive. Our analysis suggests that the coseismic rate changes for off-fault aftershocks could be explained by changes in static stress. However, to explain the activation and deactivation of on-fault seismicity, other mechanisms, such as unusual crustal properties and the presence of abundant crustal fluids, are required.
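
    As a rough illustration of the forecasting step described in this abstract, the sketch below integrates the modified Omori law n(t) = K/(t + c)^p over a 50-day forecast window and compares the expected count with an observed one. The parameter values and the observed count are hypothetical placeholders, not the fitted Kagoshima values.

```python
import numpy as np

def omori_rate(t, K, c, p):
    """Modified Omori law: aftershock rate (events/day) at time t (days)."""
    return K / (t + c) ** p

def expected_count(K, c, p, t_start, t_end):
    """Expected number of aftershocks between t_start and t_end (days),
    obtained by integrating the modified Omori rate analytically."""
    if np.isclose(p, 1.0):
        return K * np.log((t_end + c) / (t_start + c))
    return K / (1.0 - p) * ((t_end + c) ** (1.0 - p) - (t_start + c) ** (1.0 - p))

# Hypothetical parameters for one grid cell, fitted to the first 47.8 days
K, c, p = 120.0, 0.05, 1.1
forecast = expected_count(K, c, p, 47.8, 97.8)   # the following 50 days
observed = 38                                    # hypothetical observed count
print(f"forecast={forecast:.1f}, observed={observed}, "
      f"relative rate change={(observed - forecast) / forecast:+.2f}")
```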

    Statistical analysis of the induced Basel 2006 earthquake sequence: introducing a probability-based monitoring approach for Enhanced Geothermal Systems

    Geothermal energy is becoming an important clean energy source; however, the stimulation of a reservoir for an Enhanced Geothermal System (EGS) is associated with seismic risk due to induced seismicity. Seismicity occurring due to water injection at depth has to be well recorded and monitored. To mitigate the seismic risk of a damaging event, an appropriate alarm system needs to be in place for each individual experiment. In recent experiments, the so-called traffic-light alarm system, based on public response, local magnitude and peak ground velocity, was used. We aim to improve the pre-defined alarm system by introducing a probability-based approach; we retrospectively model the ongoing seismicity in real time with multiple statistical forecast models and then translate the forecasts into seismic hazard in terms of probabilities of exceeding a ground motion intensity level. One class of models accounts for the water injection rate, the main parameter that can be controlled by the operators during an experiment. By translating the models into time-varying probabilities of exceeding various intensity levels, we provide tools that are well understood by the decision makers and can be used to determine non-exceedance thresholds during a reservoir stimulation; this, however, remains an entrepreneurial or political decision of the responsible project coordinators. We introduce forecast models based on the data set of an EGS experiment in the city of Basel. Between 2006 December 2 and 8, approximately 11 500 m³ of water was injected into a 5-km-deep well at high pressures. A six-sensor borehole array was installed by the company Geothermal Explorers Limited (GEL) at depths between 300 and 2700 m around the well to monitor the induced seismicity. The network recorded approximately 11 200 events during the injection phase, more than 3500 of which were located. With the traffic-light system, actions were implemented after an ML 2.7 event: the water injection was reduced and then stopped after another ML 2.5 event. A few hours later, an earthquake with ML 3.4, felt within the city, occurred, which led to bleed-off of the well. A risk study was later issued with the outcome that the experiment could not be resumed. We analyse the statistical features of the sequence and show that it is well modelled with the Omori-Utsu law following the termination of water injection. Based on this model, the sequence will last 31 (+29/−14) years to reach the background level. We introduce statistical models based on Reasenberg and Jones and Epidemic Type Aftershock Sequence (ETAS) models, commonly used to model aftershock sequences. We compare and test different model setups to simulate the sequences, varying the number of fixed and free parameters. For one class of ETAS models, we account for the flow rate at the injection borehole. We test the models against the observed data with standard likelihood tests and find the ETAS model accounting for the flow rate to perform best. Such a model may in future serve as a valuable tool for designing probabilistic alarm systems for EGS experiments.
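
    A minimal sketch of the Omori-Utsu extrapolation mentioned above: the post-shut-in rate decays as K/(t + c)^p, and the time at which it falls to the regional background rate can be solved for directly. All parameter values below are assumed for illustration and are not the fitted Basel values.

```python
import numpy as np

def omori_utsu_rate(t, K, c, p):
    """Omori-Utsu decay of the induced seismicity rate (events/day);
    t is measured in days since the end of injection."""
    return K / (t + c) ** p

def time_to_background(K, c, p, background_rate):
    """Time (days) at which the decaying rate reaches the regional background rate."""
    return (K / background_rate) ** (1.0 / p) - c

# Assumed parameter values for illustration only (not the fitted Basel values)
K, c, p = 250.0, 0.2, 1.05
background = 0.02                                # events/day above completeness
t_days = time_to_background(K, c, p, background)
print(f"sequence reaches the background level after ~{t_days / 365.25:.0f} years")
```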

    A smoothed stochastic earthquake rate model considering seismicity and fault moment release for Europe

    We present a time-independent gridded earthquake rate forecast for the European region including Turkey. The spatial component of our model is based on kernel density estimation techniques, which we applied to both past earthquake locations and fault moment release on mapped crustal faults and subduction zone interfaces with assigned slip rates. Our forecast relies on the assumption that the locations of past seismicity are a good guide to future seismicity, and that future large-magnitude events are more likely to occur in the vicinity of known faults. We show that the optimal weighted sum of the corresponding two spatial densities depends on the magnitude range considered. The kernel bandwidths and density weighting function are optimized using retrospective likelihood-based forecast experiments. We computed earthquake activity rates (a- and b-value) of the truncated Gutenberg-Richter distribution separately for crustal and subduction seismicity based on a maximum likelihood approach that considers the spatial and temporal completeness history of the catalogue. The final annual rate of our forecast is purely driven by the maximum likelihood fit of activity rates to the catalogue data, whereas its spatial component incorporates contributions from both earthquake and fault moment-rate densities. Our model constitutes one branch of the earthquake source model logic tree of the 2013 European seismic hazard model released by the EU-FP7 project 'Seismic HAzard haRmonization in Europe' (SHARE) and contributes to the assessment of epistemic uncertainties in earthquake activity rates. We performed retrospective and pseudo-prospective likelihood consistency tests to underline the reliability of our model and SHARE's area source model (ASM), using the testing algorithms applied in the Collaboratory for the Study of Earthquake Predictability (CSEP). We comparatively tested our model's forecasting skill against the ASM and find statistically significantly better performance for testing periods of 10-20 yr. The testing results suggest that our model is a viable candidate model to serve for long-term forecasting on timescales of years to decades for the European region.
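
    The sketch below illustrates, under simplifying assumptions, the spatial ingredient of such a model: a Gaussian kernel density of past epicentres, a second density built from fault locations, and their weighted combination, where the weight may depend on the target magnitude range. Bandwidths, the weight w and the toy locations are arbitrary placeholders, not the optimized SHARE values.

```python
import numpy as np

def kernel_density(grid_xy, source_xy, bandwidth):
    """Gaussian kernel density of point sources (epicentres or fault-moment
    locations) evaluated on the forecast grid; normalized to sum to one."""
    d2 = ((grid_xy[:, None, :] - source_xy[None, :, :]) ** 2).sum(axis=-1)
    dens = np.exp(-0.5 * d2 / bandwidth ** 2).sum(axis=1)
    return dens / dens.sum()

def combined_spatial_density(seis_density, fault_density, w):
    """Weighted sum of the earthquake-based and fault-based spatial densities;
    the weight w may depend on the magnitude range considered."""
    mix = (1.0 - w) * seis_density + w * fault_density
    return mix / mix.sum()

# Toy example with random locations; bandwidths and weight are placeholders.
rng = np.random.default_rng(0)
grid = np.stack(np.meshgrid(np.linspace(0, 10, 50),
                            np.linspace(0, 10, 50)), axis=-1).reshape(-1, 2)
quakes = rng.uniform(0, 10, size=(200, 2))
faults = rng.uniform(0, 10, size=(50, 2))
density = combined_spatial_density(kernel_density(grid, quakes, bandwidth=0.8),
                                   kernel_density(grid, faults, bandwidth=1.5),
                                   w=0.4)
```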

    Earthquake detection capability of the Swiss Seismic Network

    A reliable estimate of completeness magnitudes is vital for many seismicity- and hazard-related studies. Here we adopted and further developed the Probability-based Magnitude of Completeness (PMC) method. This method determines network detection completeness (MP) using only empirical data: earthquake catalogue, phase picks and station information. To evaluate the applicability to low- or moderate-seismicity regions, we performed a case study in Switzerland. The Swiss Seismic Network (SSN) at present records seismicity with one of the densest networks of broad-band sensors in Europe. Based on data from 1983 January 1 to 2008 March 31, we found strong spatio-temporal variability of network completeness: the highest value of MP in Switzerland at present is 2.5 in the far southwest, close to the national boundary, whereas MP is lower than 1.6 in high-seismicity areas. Thus, events of magnitude 2.5 can be detected in all of Switzerland. We evaluated the temporal evolution of MP over the last 20 yr, showing the successful improvement of the SSN. We next introduced the calculation of uncertainties to the probabilistic method using a bootstrap approach. The results show that the uncertainties in completeness magnitudes are generally less than 0.1 magnitude units, implying that the method generates stable estimates of completeness magnitudes. We explored possible uses of PMC: (1) as a tool to estimate the number of missing earthquakes in moderate-seismicity regions and (2) as a network planning tool, with simulated installations of one or more virtual stations to assess the resulting completeness and identify appropriate locations for new stations. We compared our results with an existing study of completeness based on detecting the point of deviation from a power law in the earthquake-size distribution. In general, the new approach provides higher estimates of the completeness magnitude than the traditional one. We associate this observation with the difference in the sensitivity of the two approaches during periods when the event detectability of the seismic networks is low. Our results allow us to move towards a full description of completeness as a function of space and time, which can be used for hazard-model development and forecast-model testing, and they provide an illustrative example of the applicability of the PMC method to regions with low to moderate seismicity.
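
    A schematic sketch of two ingredients named above: an empirical, PMC-style per-station detection probability as a function of magnitude and distance, and a generic bootstrap uncertainty estimate. The binning, the synthetic pick history and the simple station model are illustrative assumptions, not the implementation used for the Swiss network.

```python
import numpy as np

def detection_probability(mags, dists, detected, m, r, dm=0.2, dr=20.0):
    """Empirical probability that a station detects an event of magnitude ~m at
    hypocentral distance ~r, estimated from its pick history (PMC-style)."""
    mask = (np.abs(mags - m) < dm) & (np.abs(dists - r) < dr)
    return detected[mask].mean() if mask.any() else np.nan

def bootstrap_std(estimator, data, n_boot=200, seed=0):
    """Bootstrap standard deviation of a scalar estimator applied to catalogue data."""
    rng = np.random.default_rng(seed)
    n = len(data)
    return np.std([estimator(data[rng.integers(0, n, n)]) for _ in range(n_boot)])

# Synthetic pick history for one station (magnitudes, distances in km, detections)
rng = np.random.default_rng(1)
mags = rng.uniform(0.5, 4.0, 5000)
dists = rng.uniform(5.0, 200.0, 5000)
detected = ((mags - 0.01 * dists + rng.normal(0.0, 0.3, 5000)) > 1.0).astype(float)
print(f"P(detect M2.0 at 80 km) ~ {detection_probability(mags, dists, detected, 2.0, 80.0):.2f}")
print(f"bootstrap std of the mean detection rate: {bootstrap_std(np.mean, detected):.3f}")
```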

    How present am I: three virtual reality facilities testing the fear of falling

    Virtual reality environments have long been used in studies related to architecture simulation. The main objective of this paper is to measure the sense of presence that different virtual reality devices provide to users, so as to evaluate their effectiveness when used to simulate real environments and to draw conclusions about people's behaviour when using them. The study also aims at investigating, in a quantitative way, the influence of architectural elements on the comfort of use of a built environment, namely considering the fear of falling reported by adults while using these architectural elements. Using a between-subjects design randomly distributed between two experimental conditions (safe and unsafe), a set of three studies was conducted in three different virtual reality environments using a 5-sided CAVE, a Powerwall or a Head-Mounted Display. The study shows that immersive virtual reality devices give users a higher sense of presence than semi-immersive ones. One of the conclusions of the study is that a higher sense of presence helps to enhance the perceived impacts of building spaces on users (in this case the fear of falling).

    Sensitivity study of forecasted aftershock seismicity based on Coulomb stress calculation and rate- and state-dependent frictional response

    We use the Dieterich (1994) physics-based approach to simulate the spatio-temporal evolution of seismicity caused by stress changes applied to an infinite population of nucleating patches modeled through a rate- and state-dependent friction law. According to this model, seismicity rate changes depend on the amplitude of the stress perturbation, the physical constitutive properties of the faults (represented by the parameter Aσ), the stressing rate and the background seismicity rate of the study area. In order to apply this model in a predictive manner, we need to understand the impact of the physical model parameters and the correlations between them. First, we discuss different definitions of the reference seismicity rate and show their impact on the computed rate of earthquake production, using the 1992 Landers earthquake sequence as a case study. Furthermore, we demonstrate that all model parameters are strongly correlated for physical and statistical reasons. We discuss this correlation, emphasizing that the estimates of the background seismicity rate, the stressing rate and Aσ are strongly correlated in reproducing the observed aftershock productivity. Our analytically derived relation demonstrates the impact of these model parameters on the Omori-like aftershock decay: the c-value and the productivity of the Omori law, implying a p-value smaller than or equal to 1. Finally, we discuss an optimal strategy to constrain model parameters for near-real-time forecasts.
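
    For reference, a minimal sketch of the Dieterich (1994) rate response to a single coseismic stress step under a constant stressing rate, with relaxation time t_a = Aσ / stressing rate. The parameter values below are hypothetical and are not those fitted to the Landers sequence.

```python
import numpy as np

def dieterich_rate(t, delta_tau, A_sigma, stressing_rate, r_background):
    """Seismicity rate after a coseismic stress step delta_tau, following the
    Dieterich (1994) rate-and-state formulation under a constant stressing rate.
    t_a = A_sigma / stressing_rate is the aftershock relaxation time."""
    t_a = A_sigma / stressing_rate
    return r_background / ((np.exp(-delta_tau / A_sigma) - 1.0) * np.exp(-t / t_a) + 1.0)

# Hypothetical values (MPa, MPa/day, events/day); not the Landers estimates
t = np.linspace(0.0, 3650.0, 1000)               # days after the stress step
rate = dieterich_rate(t, delta_tau=0.5, A_sigma=0.04,
                      stressing_rate=1e-4, r_background=0.1)
print(f"rate amplification at t=0: {rate[0] / 0.1:.1e}")
```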

    Characterization of extrasolar terrestrial planets from diurnal photometric variability

    The detection of massive planets orbiting nearby stars has become almost routine, but current techniques are as yet unable to detect terrestrial planets with masses comparable to the Earth's. Future space-based observatories to detect Earth-like planets are being planned. Terrestrial planets orbiting in the habitable zones of stars, where planetary surface conditions are compatible with the presence of liquid water, are of enormous interest because they might have global environments similar to Earth's and even harbor life. The light scattered by such a planet will vary in intensity and colour as the planet rotates; the resulting light curve will contain information about the planet's properties. Here we report a model that predicts features that should be discernible in light curves obtained by low-precision photometry. For extrasolar planets similar to Earth we expect daily flux variations of up to hundreds of percent, depending sensitively on ice and cloud cover. Qualitative changes in surface or climate generate significant changes in the predicted light curves. This work suggests that the meteorological variability and the rotation period of an Earth-like planet could be derived from photometric observations. Other properties, such as the composition of the surface (e.g., ocean versus land fraction), climate indicators (for example ice and cloud cover), and perhaps even signatures of Earth-like plant life, could be constrained or possibly, with further study, even uniquely determined. Comment: Published in Nature; 9 pages including 3 figures.
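
    A toy sketch of the basic idea behind such diurnal light curves, not the model reported in the paper: a rotating planet with a longitude-dependent albedo map, observed at full phase with a crude cos^2 (Lambertian-like) weighting of the visible hemisphere, produces a periodic flux modulation whose amplitude tracks the surface contrast. The albedo map and the weighting kernel are illustrative assumptions.

```python
import numpy as np

def diurnal_light_curve(albedo_map, n_steps=200):
    """Toy disk-integrated reflected flux of a rotating planet seen at full phase.
    albedo_map gives the albedo as a function of longitude on [0, 2*pi); the
    visible, illuminated hemisphere is weighted with a crude cos^2 kernel."""
    n_lon = len(albedo_map)
    lons = np.linspace(0.0, 2.0 * np.pi, n_lon, endpoint=False)
    phases = np.linspace(0.0, 2.0 * np.pi, n_steps, endpoint=False)
    flux = np.empty(n_steps)
    for i, phase in enumerate(phases):
        dlon = np.angle(np.exp(1j * (lons - phase)))          # wrapped angle
        weight = np.where(np.abs(dlon) < np.pi / 2.0, np.cos(dlon) ** 2, 0.0)
        flux[i] = (albedo_map * weight).sum() / weight.sum()
    return flux

# Hypothetical two-component surface: a bright cloud/ice patch on a dark ocean
lons = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
albedo = np.where((lons > 1.0) & (lons < 2.5), 0.6, 0.1)
lc = diurnal_light_curve(albedo)
print(f"peak-to-trough variation: {(lc.max() - lc.min()) / lc.mean():.0%}")
```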

    Can We Map Asperities Using b-Values?
