104 research outputs found
Autonomous decision-making against induced seismicity in deep fluid injections
The rise in the frequency of anthropogenic earthquakes due to deep fluid injections poses serious economic, societal, and legal challenges to geo-energy and waste-disposal projects. We propose an actuarial approach to mitigate this risk, first by defining an autonomous decision-making process based on an adaptive traffic light system (ATLS) to stop risky injections, and second by quantifying a "cost of public safety" based on the probability of an injection well being abandoned. The statistical model underlying the ATLS is first confirmed to be representative of injection-induced seismicity, with examples taken from past reservoir stimulation experiments (mostly from Enhanced Geothermal Systems, EGS). The decision strategy is then formalized: because the model is integrable, it yields a closed-form ATLS solution that maps a risk-based safety standard or norm to an earthquake magnitude not to exceed during stimulation. Finally, the EGS levelized cost of electricity (LCOE) is reformulated in terms of null expectation, with the cost of an abandoned injection well included. We find that the price increase needed to mitigate the increased seismic risk in populated areas can counterbalance the heat credit. However, this "public safety cost" disappears if buildings follow earthquake-resistant designs or if a more relaxed risk safety standard or norm is chosen.
Comment: 8 pages, 4 figures, conference (International Symposium on Energy Geotechnics, 26-28 September 2018, Lausanne, Switzerland)
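As a rough illustration of how such a closed-form threshold can be obtained, the sketch below assumes a Gutenberg-Richter-type induced-seismicity model in which the expected number of events above magnitude m grows linearly with injected volume; the parameter values, the Poisson safety criterion, and the function names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def n_exceed(m, a_fb, b, volume_m3):
    """Expected number of induced events with magnitude >= m after injecting
    volume_m3, under the assumed model N(>=m) = 10**(a_fb - b*m) * V."""
    return 10.0 ** (a_fb - b * m) * volume_m3

def prob_exceed(m, a_fb, b, volume_m3):
    """Poisson probability of at least one event with magnitude >= m."""
    return 1.0 - np.exp(-n_exceed(m, a_fb, b, volume_m3))

def magnitude_not_to_exceed(m_safety, p_norm, a_fb, b, volume_m3):
    """Map a risk norm (tolerated probability p_norm of exceeding m_safety)
    to a traffic-light magnitude: find the injected volume at which the norm
    would be violated, cap it at the planned volume, and return the magnitude
    whose expected count is ~1 at that stopping volume (a simple proxy for
    the red-light threshold)."""
    v_stop = min(-np.log(1.0 - p_norm) / 10.0 ** (a_fb - b * m_safety),
                 volume_m3)
    return (a_fb + np.log10(v_stop)) / b

# Hypothetical site parameters and safety norm, for illustration only.
print(magnitude_not_to_exceed(m_safety=5.0, p_norm=0.01,
                              a_fb=-2.0, b=1.4, volume_m3=3.0e4))
# Probability of exceeding the safety magnitude over the planned injection:
print(prob_exceed(5.0, a_fb=-2.0, b=1.4, volume_m3=3.0e4))
```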
An observational test of the origin of accelerating moment release before large earthquakes
A progressive increase of seismic activity distributed over a wide region around a future earthquake epicenter is termed accelerating moment release (AMR). This phenomenon has been observed in several studies over the last 15 years, although there is no consensus about the physical origin of the effect. In a recent hypothesis known as the stress accumulation (SA) model, the AMR is thought to result from the last stage of loading in the earthquake cycle. In this view, the increasing seismicity is due to minor stress release as the whole region becomes sufficiently stressed for the major event to occur. The stress accumulation model makes specific predictions about the distribution of events in an AMR sequence. Because the AMR is predicted to be a result of loading on the main fault, the precursory activity should be concentrated in the positive lobes of the far-field stresses calculated by a backslip dislocation model of the main shock. To test this model, AMR is first found in optimal circular regions around the epicenters of each of the Mw >= 6.5 earthquakes in central and southern California since 1950. A backslip dislocation model is then used to determine which of the precursory events occur in the regions predicted by stress accumulation. AMR is shown to occur preferentially in the lobes of the backslip stress field predicted by the stress accumulation model.
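For context, AMR analyses of this kind typically quantify the acceleration by fitting the cumulative Benioff strain with a power-law time-to-failure curve and comparing its misfit with that of a linear fit (the c-value). The sketch below illustrates that step only, using the standard magnitude-energy relation log10 E = 1.5 M + 4.8; the circular-region optimisation and the backslip stress calculation are not reproduced, and the function names and synthetic data are ours.

```python
import numpy as np
from scipy.optimize import curve_fit

def c_value(times, magnitudes, t_failure):
    """Ratio of the RMS misfit of a power-law time-to-failure fit,
    S(t) = A + B*(t_failure - t)**p, to that of a linear fit of the
    cumulative Benioff strain; c well below 1 indicates acceleration."""
    t = np.asarray(times, dtype=float)
    m = np.asarray(magnitudes, dtype=float)
    strain = np.cumsum(np.sqrt(10.0 ** (1.5 * m + 4.8)))  # cumulative Benioff strain

    def power_law(t, a, b, p):
        return a + b * (t_failure - t) ** p

    p0 = (strain[-1],
          -(strain[-1] - strain[0]) / (t_failure - t[0]) ** 0.3,
          0.3)
    popt, _ = curve_fit(power_law, t, strain, p0=p0, maxfev=10000)
    rms_pl = np.sqrt(np.mean((strain - power_law(t, *popt)) ** 2))
    lin = np.polyval(np.polyfit(t, strain, 1), t)
    rms_lin = np.sqrt(np.mean((strain - lin) ** 2))
    return rms_pl / rms_lin

# Synthetic accelerating sequence (event times densify toward t_failure).
t_f = 10.0
u = np.linspace(0.01, 0.99, 60)
event_times = t_f * (1.0 - (1.0 - u) ** 2)
print(c_value(event_times, np.full(60, 4.0), t_failure=t_f))
```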
The frequency-size scaling of non-volcanic tremors beneath the San Andreas Fault at Parkfield: Possible implications for seismic energy release
We analyse the frequency-size distribution of non-volcanic tremors observed along the Parkfield section of the San Andreas Fault. We suggest that these non-volcanic tremors follow a power-law scaling typical of scale-invariant, stick-slip tectonic earthquakes, but with an unusually high scaling exponent of more than 2.0 and a systematic depth dependency. While each individual non-volcanic tremor releases only a minuscule amount of energy and slip, this is more than compensated by their sheer numbers. Consequently, the integrated contribution of this largely "invisible" seismicity (non-volcanic tremors and nano-earthquakes) is non-negligible and could potentially account for up to 100% of the plate motion in selected patches along the San Andreas Fault.
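As a pointer to how such a scaling exponent is typically measured, the snippet below applies the standard Aki/Utsu maximum-likelihood b-value estimator to a synthetic magnitude set; the tremor catalogue, magnitude scale, and completeness level used in the study are not reproduced here.

```python
import numpy as np

def b_value_mle(magnitudes, m_c, bin_width=0.0):
    """Maximum-likelihood scaling exponent (Aki/Utsu b-value estimator) for
    events with magnitude >= m_c; set bin_width to the catalogue's magnitude
    binning (e.g. 0.1) to apply the usual half-bin correction."""
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= m_c]
    if m.size < 2:
        raise ValueError("not enough events above the completeness magnitude")
    b = np.log10(np.e) / (m.mean() - (m_c - bin_width / 2.0))
    return b, b / np.sqrt(m.size)   # estimate and first-order uncertainty

# Synthetic check: exponential magnitudes consistent with a scaling exponent
# of ~2.2 (comparable to the >2.0 values reported for the tremors).
rng = np.random.default_rng(42)
mags = rng.exponential(scale=np.log10(np.e) / 2.2, size=5000)
b_hat, b_err = b_value_mle(mags, m_c=0.0)
print(f"b = {b_hat:.2f} +/- {b_err:.2f}")
```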
MATRIX Results II and Reference Report
As populations increase, especially in urban areas, the number of people affected by natural hazards is growing, as many regions of the world are subject to multiple hazards. Although the volume of geophysical, sociological and economic knowledge is expanding, so are the losses from natural catastrophes. The slow transfer of appropriate knowledge from theory to practice may be due to the difficulties inherent in the communication process from science to policy-making, including perceptions by stakeholders from disaster mitigation practice regarding the usability of any developed tools. As scientific evidence shows, decision-makers are faced with the challenge of mitigating not only single hazards and risks but also multiple risks, which requires consideration of their interrelations. As the multi-hazard and multi-risk concept is a relatively young area of natural risk governance, there are only a few multi-risk models, and practitioners' experience in using these models is limited. To our knowledge, scientific literature on stakeholders' perceptions of multi-risk models is lacking. In this document, we identify the perceptions of two decision-making tools that involve multi-hazard and multi-risk. The first is a generic multi-risk framework based on the sequential Monte Carlo method, which allows for a straightforward and flexible implementation of hazard interactions that may occur in a complex system. The second is a decision-making tool that directly integrates input from stakeholders by attributing weights to different components and constructing risk ratings. Based on the feedback from stakeholders, we found that interest in multi-risk assessment is high, but that its application remains hampered by the complexity of the processes involved.
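To make the sequential Monte Carlo idea concrete, here is a minimal two-hazard sketch in which one hazard conditions the occurrence probability of the other; all rates, interaction probabilities, and losses are hypothetical placeholders, and the MATRIX framework itself is considerably more general.

```python
import numpy as np

# Minimal sketch of a sequential Monte Carlo multi-risk simulation with one
# hazard interaction: an earthquake in a given year raises the conditional
# probability of a damaging landslide.
rng = np.random.default_rng(3)

P_EQ = 0.05            # annual probability of a damaging earthquake (hypothetical)
P_LS = 0.02            # baseline annual probability of a damaging landslide
P_LS_GIVEN_EQ = 0.20   # landslide probability in a year with an earthquake
LOSS_EQ, LOSS_LS = 10.0, 3.0   # losses in arbitrary monetary units

def simulate_year():
    loss = 0.0
    eq = rng.random() < P_EQ
    if eq:
        loss += LOSS_EQ
    # hazard interaction: condition the landslide probability on the earthquake
    p_ls = P_LS_GIVEN_EQ if eq else P_LS
    if rng.random() < p_ls:
        loss += LOSS_LS
    return loss

losses = np.array([simulate_year() for _ in range(100_000)])
print("expected annual loss:", losses.mean())
print("P(annual loss >= 10):", (losses >= 10.0).mean())
```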
Detailed Investigation of the Foreshock Sequence of the 2010 Mw7.2 El Mayor-Cucapah Earthquake
Foreshocks can provide valuable information about the possible nucleation process of a mainshock. However, their physical mechanisms are still under debate. In this study, we present a comprehensive analysis of the earthquake sequence preceding the 2010 Mw7.2 El Mayor-Cucapah mainshock, including waveform detection of missing smaller events, relative relocation, and source parameter analysis. Based on a template matching method, we find a tenfold increase in the number of earthquakes compared with the Southern California Seismic Network (SCSN) catalog. The entire sequence exhibits nearly continuous episodes of foreshocks that can be loosely separated into two active clusters. Relocated foreshocks show several seismicity streaks at depth, with a consistently active cluster at depths between 14 and 16 km where the mainshock nucleated. Stress drop measurements from a spectral ratio approach based on empirical Green's functions show a range between 3.8 and 41.7 MPa with a median of 13.0 MPa and no clear temporal variations. The relocation results, together with the source patches estimated from earthquake corner frequencies, reveal a migration front toward the mainshock hypocenter within the last 8 hr and a chain of active bursts immediately 6 min prior to the mainshock. Our results support combined effects of aseismic slip and cascading failure on the evolution of foreshocks.
Plain Language Summary: The 2010 Mw7.2 El Mayor-Cucapah (EMC) earthquake was preceded by a prominent sequence of foreshocks starting ~21 days before the mainshock. Several methods based on the similarities of waveforms are applied to obtain the spatiotemporal evolution of the foreshocks. Ten times more events are found with a template matching method than are listed in the SCSN catalog. The refined relative locations reveal two main active clusters in time, as well as two spatial patches, with the shallower one to the north of the mainshock epicenter. The depth distribution indicates several seismicity streaks, with a consistently active cluster at depths of 14-16 km where the mainshock started. An active cluster of foreshocks occurred in the last 6 min. They likely altered the stress state near the hypocenter and ultimately triggered the mainshock. Our analysis indicates that both aseismic slip and cascade triggering processes occurred and contributed to the eventual triggering of the EMC mainshock.
Key Points: A waveform matching technique leads to a tenfold increase in the number of foreshocks compared with the SCSN catalog. We resolve the corner frequencies of 20 foreshocks using the detected events as empirical Green's functions. The relocated catalog and estimated source patches reveal effects of both aseismic slip and cascading stress transfer.
Peer Reviewed. https://deepblue.lib.umich.edu/bitstream/2027.42/155988/1/jgrb54188.pdf https://deepblue.lib.umich.edu/bitstream/2027.42/155988/2/jgrb54188_am.pd
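The template matching step can be illustrated with a single-channel sketch: slide a template over continuous data, compute the normalized cross-correlation, and declare detections above a multiple of the median absolute deviation of the correlation trace (a common criterion; the study stacks correlations over many stations and components, and the threshold and synthetic data below are illustrative).

```python
import numpy as np

def normalized_cc(template, data):
    """Sliding normalized cross-correlation of a short template against a
    longer continuous trace (single channel, brute force; real workflows
    stack the correlation traces over many stations and components)."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    cc = np.empty(len(data) - n + 1)
    for i in range(len(cc)):
        win = data[i:i + n]
        std = win.std()
        cc[i] = 0.0 if std == 0 else np.dot(t, (win - win.mean()) / std)
    return cc

def detect(cc, n_mad=9.0):
    """Indices where the correlation trace exceeds n_mad times its median
    absolute deviation, a common template-matching detection criterion."""
    mad = np.median(np.abs(cc - np.median(cc)))
    return np.where(cc > n_mad * mad)[0]

# Toy demonstration: a noisy trace containing two copies of the template.
rng = np.random.default_rng(1)
template = rng.standard_normal(200)
data = 0.3 * rng.standard_normal(20000)
for onset in (5000, 12345):
    data[onset:onset + 200] += template
print(detect(normalized_cc(template, data)))   # expect detections near 5000 and 12345
```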
Dragon-kings: mechanisms, statistical methods and empirical evidence
This introductory article presents the special Discussion and Debate volume "From black swans to dragon-kings, is there life beyond power laws?" published in Eur. Phys. J. Special Topics in May 2012. We summarize the contributions and put them in perspective under three main themes: (i) mechanisms for dragon-kings, (ii) detection of dragon-kings and statistical tests, and (iii) empirical evidence in a large variety of natural and social systems. Overall, we are pleased to witness significant advances both in the introduction and clarification of underlying mechanisms and in the development of novel, efficient tests that demonstrate clear evidence for the presence of dragon-kings in many systems. However, this positive view should be balanced by the fact that this remains a very delicate and difficult field, if only due to the scarcity of data as well as the extraordinarily important implications with respect to hazard assessment, risk control and predictability.
Comment: 20 pages
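In the spirit of the detection theme, the sketch below shows one simple outlier diagnostic: fit a power-law tail with the Hill estimator while excluding the suspected dragon-king, then ask how likely the observed maximum would be under that tail. It is an illustrative check with hypothetical numbers, not one of the formal tests collected in the volume.

```python
import numpy as np

def dragon_king_pvalue(sample, tail_fraction=0.1, n_suspect=1):
    """Probability that the largest observation would arise from the fitted
    power-law tail (small values hint at a dragon-king-like outlier)."""
    x = np.sort(np.asarray(sample, dtype=float))
    k = max(int(tail_fraction * x.size), n_suspect + 10)
    tail = x[-k:]                      # upper tail, ascending
    bulk = tail[:-n_suspect]           # exclude the suspected outlier(s)
    x_min, excess = bulk[0], bulk[1:]
    alpha = excess.size / np.sum(np.log(excess / x_min))   # Hill estimator
    p_single = (x[-1] / x_min) ** (-alpha)   # Pareto exceedance probability
    return 1.0 - (1.0 - p_single) ** k       # chance over k tail draws

# Synthetic example: a Pareto-tailed sample plus one injected "king".
rng = np.random.default_rng(7)
data = rng.pareto(1.5, size=2000) + 1.0
data[0] = 5000.0                       # hypothetical extreme event
print(dragon_king_pvalue(data))        # small value flags the outlier
```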
A novel Multiple-Expert Protocol to manage uncertainty and subjective choices in probabilistic single and multi-hazard risk analyses
Data availability: No data was used for the research described in the article.
Integrating diverse expert opinions in hazard and risk projects is essential to managing subjective decisions and quantifying uncertainty so as to produce stable and trustworthy results. A structured procedure is necessary to organize the gathering of experts' opinions while ensuring transparency, accountability, and independence of judgement. We propose a novel Multiple-Expert management Protocol (MEP) to address this challenge, providing procedural guidelines for conducting single- to multi-hazard risk analyses. MEP establishes a workflow to manage subjectivity rooted in (i) moderated and staged group interactions, (ii) trackable blind advice through written elicitations with mathematical aggregation, (iii) participatory independent review, (iv) close cooperation between scientific and managerial coordination, and (v) proper and comprehensive documentation. Originally developed for stress testing critical infrastructure, MEP is designed as a single, flexible, technology-neutral procedural workflow applicable to various sectors. Moreover, its scalability allows it to adapt from high- to low-budget projects and from complex probabilistic multi-hazard risk assessments to standard single-hazard analyses, with the degree and type of expert involvement depending on available funding and emerging controversies. We present two compelling case studies to showcase MEP's practical applicability: a multi-hazard risk analysis for a port infrastructure and a single-hazard regional tsunami hazard assessment.
European Community's Seventh Framework Programme under Grant Agreement No. 603389 & Mechanism of the European Civil Protection and Humanitarian Aid Operations with grant no. ECHO/SUB/2015/718568/PREV26 (https://ec.europa.eu/echo/funding-evaluations/financing-civil-protection-europe/selected-projects/probabilistic-tsunami-hazard_en)
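As a minimal illustration of the mathematical aggregation step, the snippet below applies a weighted linear opinion pool to experts' elicited exceedance probabilities; the expert weights, probability values, and function name are hypothetical, and structured protocols such as MEP typically derive weights from calibration and the staged group process rather than fixing them by hand.

```python
import numpy as np

def linear_pool(expert_probs, weights=None):
    """Weighted linear opinion pool: combine experts' elicited probabilities
    (one row per expert, one column per question) into a single estimate.
    Equal weights are used when none are supplied."""
    q = np.asarray(expert_probs, dtype=float)
    w = (np.full(q.shape[0], 1.0 / q.shape[0]) if weights is None
         else np.asarray(weights, dtype=float))
    w = w / w.sum()
    return w @ q

# Hypothetical elicitation: three experts give the annual exceedance
# probability of three hazard-intensity thresholds at a site.
experts = [[1e-2, 1e-3, 1e-4],
           [2e-2, 5e-3, 5e-4],
           [5e-3, 1e-3, 2e-4]]
print(linear_pool(experts, weights=[0.5, 0.3, 0.2]))
```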
- …