
    The 1997 Kagoshima (Japan) earthquake doublet: A quantitative analysis of aftershock rate changes

    We quantitatively map relative rate changes for the aftershock sequence following the second mainshock of the 1997 earthquake doublet (M_W = 6.1, M_W = 6.0) in the Kagoshima province (Japan). Using the spatial distribution of the modified Omori law parameters for aftershocks that occurred during the 47.8 days between the two mainshocks, we forecast the aftershock activity in the next 50 days and compare it with the actually observed rates. The relative rate change map reveals four regions with statistically significant relative rate changes - three negative and one positive. Our analysis suggests that the coseismic rate changes for off-fault aftershocks could be explained by changes in static stress. However, to explain the activation and deactivation of on-fault seismicity, other mechanisms, such as unusual crustal properties and the presence of abundant crustal fluids, are required.
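
    To illustrate the kind of calculation involved, the sketch below evaluates a modified Omori law, n(t) = K / (t + c)^p, integrates it over a 50-day forecast window, and compares the result with a hypothetical observed count to form a simple relative rate change. The parameter values, the observed count, and the rate-change definition are illustrative assumptions, not the values or exact procedure used in the study.

```python
from scipy.integrate import quad

def omori_rate(t, K, c, p):
    """Modified Omori law: aftershock rate (events/day) at time t days after the mainshock."""
    return K / (t + c) ** p

def expected_count(t_start, t_end, K, c, p):
    """Expected number of aftershocks in [t_start, t_end] days, by integrating the rate."""
    n, _ = quad(omori_rate, t_start, t_end, args=(K, c, p))
    return n

# Hypothetical parameters "fitted" to the 47.8 days between the two mainshocks
K, c, p = 120.0, 0.05, 1.1

# Forecast the following 50 days and compare with a hypothetical observed count
forecast = expected_count(47.8, 97.8, K, c, p)
observed = 35
relative_rate_change = (observed - forecast) / forecast
print(f"forecast = {forecast:.1f} events, relative rate change = {relative_rate_change:+.2f}")
```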

    A smoothed stochastic earthquake rate model considering seismicity and fault moment release for Europe

    We present a time-independent gridded earthquake rate forecast for the European region including Turkey. The spatial component of our model is based on kernel density estimation techniques, which we applied to both past earthquake locations and fault moment release on mapped crustal faults and subduction zone interfaces with assigned slip rates. Our forecast relies on the assumption that the locations of past seismicity are a good guide to future seismicity, and that future large-magnitude events are more likely to occur in the vicinity of known faults. We show that the optimal weighted sum of the corresponding two spatial densities depends on the magnitude range considered. The kernel bandwidths and density weighting function are optimized using retrospective likelihood-based forecast experiments. We computed earthquake activity rates (a- and b-value) of the truncated Gutenberg-Richter distribution separately for crustal and subduction seismicity based on a maximum likelihood approach that considers the spatial and temporal completeness history of the catalogue. The final annual rate of our forecast is purely driven by the maximum likelihood fit of activity rates to the catalogue data, whereas its spatial component incorporates contributions from both earthquake and fault moment-rate densities. Our model constitutes one branch of the earthquake source model logic tree of the 2013 European seismic hazard model released by the EU-FP7 project ‘Seismic HAzard haRmonization in Europe' (SHARE) and contributes to the assessment of epistemic uncertainties in earthquake activity rates. We performed retrospective and pseudo-prospective likelihood consistency tests to underline the reliability of our model and SHARE's area source model (ASM) using the testing algorithms applied in the Collaboratory for the Study of Earthquake Predictability (CSEP). We comparatively tested our model's forecasting skill against the ASM and find a statistically significantly better performance for testing periods of 10-20 yr. The testing results suggest that our model is a viable candidate model for long-term forecasting on timescales of years to decades for the European region.
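
    As an illustration of the magnitude-frequency component, the sketch below evaluates the annual exceedance rate of a truncated Gutenberg-Richter distribution from an a-value, b-value and magnitude bounds. The parameter values are hypothetical placeholders; the kernel smoothing, density weighting and maximum-likelihood fitting steps of the actual model are not reproduced here.

```python
import numpy as np

def annual_rate_above(m, a, b, m_min, m_max):
    """
    Annual rate of events with magnitude >= m under a truncated
    Gutenberg-Richter distribution with activity a, slope b and
    magnitude bounds [m_min, m_max].
    """
    beta = b * np.log(10.0)
    n_min = 10.0 ** (a - b * m_min)  # total annual rate above m_min
    num = np.exp(-beta * (m - m_min)) - np.exp(-beta * (m_max - m_min))
    den = 1.0 - np.exp(-beta * (m_max - m_min))
    return n_min * np.clip(num / den, 0.0, 1.0)  # clip handles m outside the bounds

# Hypothetical activity values for a single source zone
print(annual_rate_above(6.0, a=4.5, b=1.0, m_min=4.5, m_max=8.0))
```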

    Statistical analysis of the induced Basel 2006 earthquake sequence: introducing a probability-based monitoring approach for Enhanced Geothermal Systems

    Geothermal energy is becoming an important clean energy source; however, the stimulation of a reservoir for an Enhanced Geothermal System (EGS) is associated with seismic risk due to induced seismicity. Seismicity occurring due to water injection at depth has to be well recorded and monitored. To mitigate the seismic risk of a damaging event, an appropriate alarm system needs to be in place for each individual experiment. In recent experiments, the so-called traffic-light alarm system, based on public response, local magnitude and peak ground velocity, was used. We aim to improve the pre-defined alarm system by introducing a probability-based approach; we retrospectively model the ongoing seismicity in real time with multiple statistical forecast models and then translate the forecasts into seismic hazard in terms of probabilities of exceeding a ground motion intensity level. One class of models accounts for the water injection rate, the main parameter that can be controlled by the operators during an experiment. By translating the models into time-varying probabilities of exceeding various intensity levels, we provide tools which are well understood by the decision makers and can be used to determine thresholds of non-exceedance during a reservoir stimulation; this, however, remains an entrepreneurial or political decision of the responsible project coordinators. We introduce forecast models based on the data set of an EGS experiment in the city of Basel. Between 2006 December 2 and 8, approximately 11 500 m³ of water was injected into a 5-km-deep well at high pressures. A six-sensor borehole array was installed by the company Geothermal Explorers Limited (GEL) at depths between 300 and 2700 m around the well to monitor the induced seismicity. The network recorded approximately 11 200 events during the injection phase, more than 3500 of which were located. With the traffic-light system, actions were implemented after an ML 2.7 event: the water injection was reduced and then stopped after another ML 2.5 event. A few hours later, an earthquake with ML 3.4, felt within the city, occurred, which led to bleed-off of the well. A risk study was later issued with the outcome that the experiment could not be resumed. We analyse the statistical features of the sequence and show that the sequence is well modelled with the Omori-Utsu law following the termination of water injection. Based on this model, the sequence will last 31 (+29/−14) years to reach the background level. We introduce statistical models based on Reasenberg and Jones and Epidemic Type Aftershock Sequence (ETAS) models, commonly used to model aftershock sequences. We compare and test different model setups to simulate the sequences, varying the number of fixed and free parameters. For one class of the ETAS models, we account for the flow rate at the injection borehole. We test the models against the observed data with standard likelihood tests and find that the ETAS model accounting for the flow rate performs best. Such a model may in future serve as a valuable tool for designing probabilistic alarm systems for EGS experiments.
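
    The translation from a forecast rate into a time-varying probability can be illustrated with a minimal sketch: assuming a Poisson process in time and a Gutenberg-Richter magnitude distribution, the probability of at least one event above a target magnitude within a time window follows directly from the forecast rate. The rate, b-value and magnitudes below are hypothetical, and the sketch omits the ground-motion intensity step described in the abstract.

```python
import numpy as np

def prob_exceedance(rate_m0, b, m, m0, dt_days):
    """
    Probability of at least one event with magnitude >= m within dt_days,
    assuming a Poisson process in time and a Gutenberg-Richter magnitude
    distribution. rate_m0 is the forecast daily rate of events with
    magnitude >= m0.
    """
    rate_m = rate_m0 * 10.0 ** (-b * (m - m0))
    return 1.0 - np.exp(-rate_m * dt_days)

# Hypothetical forecast: 20 events/day above M 0.9, b-value 1.5
for m in (2.0, 2.5, 3.0):
    p = prob_exceedance(rate_m0=20.0, b=1.5, m=m, m0=0.9, dt_days=1.0)
    print(f"P(M >= {m} within 24 h) = {p:.2f}")
```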

    Earthquake detection capability of the Swiss Seismic Network

    A reliable estimate of completeness magnitudes is vital for many seismicity- and hazard-related studies. Here we adopted and further developed the Probability-based Magnitude of Completeness (PMC) method. This method determines network detection completeness (MP) using only empirical data: earthquake catalogue, phase picks and station information. To evaluate the applicability to low- or moderate-seismicity regions, we performed a case study in Switzerland. The Swiss Seismic Network (SSN) at present is recording seismicity with one of the densest networks of broad-band sensors in Europe. Based on data from 1983 January 1 to 2008 March 31, we found strong spatio-temporal variability of network completeness: the highest value of MP in Switzerland at present is 2.5 in the far southwest, close to the national boundary, whereas MP is lower than 1.6 in high-seismicity areas. Thus, events of magnitude 2.5 can be detected in all of Switzerland. We evaluated the temporal evolution of MP for the last 20 yr, showing the successful improvement of the SSN. We next introduced the calculation of uncertainties into the probabilistic method using a bootstrap approach. The results show that the uncertainties in completeness magnitudes are generally less than 0.1 magnitude units, implying that the method generates stable estimates of completeness magnitudes. We explored the possible use of PMC: (1) as a tool to estimate the number of missing earthquakes in moderate-seismicity regions and (2) as a network planning tool, simulating the installation of one or more virtual stations to assess the resulting completeness and identify appropriate locations for new stations. We compared our results with an existing study of the completeness based on detecting the point of deviation from a power law in the earthquake-size distribution. In general, the new approach provides higher estimates of the completeness magnitude than the traditional one. We associate this observation with the difference in the sensitivity of the two approaches in periods where the event detectability of the seismic networks is low. Our results allow us to move towards a full description of completeness as a function of space and time, which can be used for hazard-model development and forecast-model testing, and provide an illustrative example of the applicability of the PMC method to regions with low to moderate seismicity.
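
    The network-combination step of a probability-based completeness estimate can be sketched as follows: given per-station detection probabilities for a hypothetical event, the probability that at least a minimum number of stations detect it follows from a Poisson-binomial calculation. The station probabilities and the four-station requirement below are illustrative assumptions and do not reproduce the empirical PMC procedure.

```python
import numpy as np

def prob_at_least_k(p_detect, k):
    """
    Probability that at least k stations detect an event, given independent
    per-station detection probabilities p_detect (Poisson-binomial
    distribution, built up by dynamic programming).
    """
    dist = np.zeros(len(p_detect) + 1)
    dist[0] = 1.0
    for p in p_detect:
        dist[1:] = dist[1:] * (1.0 - p) + dist[:-1] * p
        dist[0] *= (1.0 - p)
    return dist[k:].sum()

# Hypothetical per-station detection probabilities for one magnitude/location
p_stations = [0.95, 0.90, 0.80, 0.60, 0.40, 0.30]
# Assume the event is "detected by the network" if at least 4 stations trigger
print(prob_at_least_k(p_stations, k=4))
```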

    Generation of photovoltage in graphene on a femtosecond time scale through efficient carrier heating

    Graphene is a promising material for ultrafast and broadband photodetection. Earlier studies addressed the general operation of graphene-based photo-thermoelectric devices and their switching speed, which is limited by the charge carrier cooling time and is on the order of picoseconds. However, the generation of the photovoltage could occur on a much faster time scale, as it is associated with the carrier heating time. Here, we measure the photovoltage generation time and find it to be faster than 50 femtoseconds. As a proof-of-principle application of this ultrafast photodetector, we use graphene to directly measure, electrically, the pulse duration of a sub-50 femtosecond laser pulse. The observation that carrier heating is ultrafast suggests that energy from absorbed photons can be efficiently transferred to carrier heat. To study this, we examine the spectral response and find a constant spectral responsivity between 500 and 1500 nm, consistent with efficient electron heating. These results are promising for ultrafast femtosecond and broadband photodetector applications.

    Binding, domain orientation, and dynamics of the Lck SH3-SH2 domain pair and comparison with other Src-family kinases

    The catalytic activity of Src-family kinases is regulated by association with their SH3 and SH2 domains. Activation requires displacement of intramolecular contacts by SH3/SH2-binding ligands, resulting in dissociation of the SH3 and SH2 domains from the kinase domain. To understand the contribution of the SH3-SH2 domain pair to this regulatory process, the binding of peptides derived from physiologically relevant SH2 and SH3 interaction partners was studied for Lck and its relative Fyn by NMR spectroscopy. In contrast to Fyn, activating ligands do not induce communication between the SH2 and SH3 domains in Lck. This can be attributed to the particular properties of the Lck SH3-SH2 linker, which is shown to be extremely flexible, thus effectively decoupling the behavior of the SH3 and SH2 domains. Measurements on the SH32 tandem from Lck further revealed a relative domain orientation that is distinctly different from that found in the Lck SH32 crystal structure and in other Src kinases. These data suggest that flexibility between the SH2 and SH3 domains contributes to the adaptation of Src-family kinases to specific environments and distinct functions.