
    The physics of earthquake forecasting

    The Coulomb stress theory is the basic physical principle upon which scientists rely to improve our understanding of earthquake triggering processes and, therefore, our ability to forecast future earthquake hazards. The idea that the regional stress redistribution following a large earthquake will affect other faults has been recognized since the late nineteenth century and was passed on for further consideration by Charles Richter. However, we still struggle to define its implementation principles in short‐term forecasts. This opinion article discusses recent advances in physics‐based earthquake forecasting to motivate an open discussion about what we have collectively learnt from the last 30 years of published research on physics‐based forecasts and where future experiments should aim. Considering that seismologists have been aware of the connection between stress redistribution effects and seismicity response for decades, if not a century, it is surprising that progress in understanding the physics of earthquake triggering has been so slow. Looking at the rapid advancement of statistical forecasting, which was conceptualized by Ogata (1985, 1988, 1998) and is now the reference mathematical approximation of earthquake triggering processes, one would argue that physics had quite a head start but somewhere along the way slowed down. So what is so challenging in the realization of Coulomb stress theory? Is it implementation challenges, such as the different input data products required, or our limited understanding of earthquake triggering mechanisms? Segou and Parsons (2020) looked into past implementations while focusing on a systematic reassessment of Coulomb stress theory using the data‐rich M 7.2 El Mayor–Cucapah sequence. The evaluation of past hypotheses motivated the development of a new technique to forecast the rupture styles of triggered seismicity.
In the minds of the seismologists working on the issue today, elastic stress redistribution equals Coulomb stress‐change estimates. In the early 1990s, there was enthusiasm that the basic principle, namely coseismic stress changes, was the accurate operator for large‐magnitude aftershock prediction (Stein, 1999). More complex ideas were proposed supporting the role of the regional stress field in priming faults well oriented for failure, while still attributing aftershock occurrence solely to coseismic stress changes (King et al., 1994). Two major assumptions were passed on from these early influential works: first, that a coseismic stress triggering threshold of 0.01 MPa is required (Harris and Simpson, 1992); and second, that the most hazardous faults in evolving aftershock sequences are the ones on which stress is maximized (King et al., 1994). The 1992 M 7.3 Landers cascade revolutionized the way seismologists thought not only about local aftershock patterns but also about remote dynamic triggering; in a seminal work, Hill et al. (1993) described the far reach of this mainshock, which increased seismic activity across much of the western United States. Around the same time, the laboratory‐confirmed rate‐and‐state friction law brought continuum mechanics into aftershock forecasts by describing triggered seismicity as a response to these estimated stress perturbations (Dieterich, 1994). By the early 2000s, research related to remote dynamic triggering (e.g., Prejean and Hill, 2009) and borehole breakouts (Townend and Zoback, 2004) revealed that even minuscule stress changes from teleseismic waves can trigger seismicity and that the crust is always in a critical state, even in low‐strain‐rate intraplate regions. These results imply that active faults anywhere in the crust balance at the cusp of failure and that even the smallest stress perturbations can lead to failure.
A few years later, the improvement of regional networks allowed for global studies on remote dynamic triggering (e.g., Hill and Prejean, 2015), revealing that the magnitude of peak dynamic stresses is not the controlling factor behind triggering potential; rather, the orientation of regional faults with respect to the backazimuth of incoming waves plays an important role in susceptibility (Parsons et al., 2014). More complex observations related to microearthquakes (Aiken and Peng, 2014) and tremor triggering suggested that low effective stress results in a relatively low triggering threshold of around 2–3 kPa in central California (Peng et al., 2009). No matter how provocative these findings were, and still are, they did not change the implementation of Coulomb stress theory.
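The quantity at the heart of the theory discussed above is the Coulomb failure stress change resolved on a receiver fault. A minimal sketch, with unclamping taken as positive; the effective friction coefficient and the input stress changes are illustrative values, not results from any of the cited studies:

```python
def coulomb_stress_change(delta_shear, delta_normal, mu_eff=0.4):
    """Coulomb failure stress change (MPa) on a receiver fault.

    delta_shear : shear stress change resolved in the slip direction
    delta_normal: normal stress change, positive for unclamping
    mu_eff      : effective friction coefficient (illustrative value)
    """
    return delta_shear + mu_eff * delta_normal

# A fault loaded in shear and unclamped is pushed toward failure;
# 0.01 MPa is the Harris and Simpson (1992) triggering threshold.
dcfs = coulomb_stress_change(0.05, 0.02)
promoted = dcfs >= 0.01
```

A positive change brings the receiver fault closer to failure, a negative one (a "stress shadow") moves it away; the debate reviewed above concerns how reliably this scalar predicts where aftershocks occur.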

    Prospective earthquake forecasts at the Himalayan Front after the 25 April 2015 M 7.8 Gorkha Mainshock

    When a major earthquake strikes, the resulting devastation can be compounded or even exceeded by the subsequent cascade of triggered seismicity. As the Nepalese recover from the 25 April 2015 shock, knowledge of what comes next is essential. We calculate the redistribution of crustal stresses and the implied earthquake probabilities for different periods, from daily to 30 years into the future. An initial forecast was completed before an M 7.3 earthquake struck on 12 May 2015, which enables a preliminary assessment; postforecast seismicity has so far occurred within a zone of fivefold probability gain. Evaluation of the forecast performance, using two months of seismic data, reveals that stress‐based approaches present improved skill for higher‐magnitude triggered seismicity. Our results suggest that considering the total stress field, rather than only the coseismic one, improves the spatial performance of the model through the estimation of a wide range of potential triggered faults following a mainshock.
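The step from stress changes to earthquake probabilities typically runs through the Dieterich (1994) rate‐and‐state response to a static stress step. A minimal sketch; the constitutive parameter, relaxation time, and background rate are assumed values, not the paper's calibration:

```python
import math

def dieterich_rate(t, dcfs, r_bg=1.0, a_sigma=0.04, t_a=3650.0):
    """Dieterich (1994) seismicity-rate response to a static stress step.

    t       : time since the stress step (days)
    dcfs    : Coulomb stress change (MPa)
    r_bg    : background seismicity rate (events/day)
    a_sigma : constitutive parameter A*sigma (MPa) -- assumed value
    t_a     : aftershock relaxation time (days) -- assumed value
    """
    gamma = (math.exp(-dcfs / a_sigma) - 1.0) * math.exp(-t / t_a)
    return r_bg / (1.0 + gamma)

# One day after a +0.1 MPa step the rate is strongly elevated over the
# background; it relaxes back toward r_bg over the time scale t_a.
gain = dieterich_rate(1.0, 0.1) / 1.0
```

Integrating this rate over a forecast window and applying a Poisson assumption yields the period‐dependent probabilities (daily to 30 years) mentioned above.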

    Probabilistic Forecasting of Hydraulic Fracturing Induced Seismicity Using an Injection-Rate Driven ETAS Model

    The development of robust forecasts of human‐induced seismicity is highly desirable to mitigate the effects of disturbing or damaging earthquakes. We assess the performance of a well‐established statistical model, the epidemic‐type aftershock sequence (ETAS) model, with a catalog of approximately 93,000 microearthquakes observed at the Preston New Road (PNR, United Kingdom) unconventional shale gas site during and after hydraulic fracturing of the PNR‐1z and PNR‐2 wells. Because ETAS was developed for tectonic seismicity under slower loading rates, we also generate three modified ETAS models with background rates proportional to injection rates, to account for seismicity caused by pressurized fluid. We find that (1) the standard ETAS model captures the low seismicity between and after injections but is outperformed by the modified models during high‐seismicity periods, and (2) the injection‐rate‐driven ETAS model improves substantially when the forecast is calibrated on sleeve‐specific pumping data. We finally forecast the PNR‐2 seismicity out of sample using the average response to injection observed at PNR‐1z, achieving better predictive skill than the in‐sample standard ETAS model. The insights from this study contribute toward producing informative seismicity forecasts for real‐time decision making and risk mitigation during unconventional shale gas development.
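The modification described above can be sketched as an ETAS conditional intensity whose background term is driven by the injection rate. All parameter values below are illustrative, not the PNR calibration, and the real model is fitted by maximum likelihood rather than fixed by hand:

```python
import math

def etas_intensity(t, catalog, q_of_t, mu0=0.01, c_q=0.5,
                   K=0.02, alpha=1.8, c=0.001, p=1.1, m0=0.0):
    """Conditional intensity of an ETAS model whose background rate is
    driven by the injection rate q(t) (illustrative parameter values).

    catalog : list of (t_i, m_i) past events
    q_of_t  : callable returning the injection rate at time t
    """
    background = mu0 + c_q * q_of_t(t)   # forced term: follows the pumping
    triggered = sum(K * math.exp(alpha * (m_i - m0)) / (t - t_i + c) ** p
                    for t_i, m_i in catalog if t_i < t)
    return background + triggered

# During injection the background term dominates; between stages it shuts off
# and only the Omori-type aftershock cascade remains.
events = [(0.5, 1.2), (0.8, 0.4)]
lam_on = etas_intensity(1.0, events, lambda t: 1.0)   # pumping at unit rate
lam_off = etas_intensity(1.0, events, lambda t: 0.0)  # pumping stopped
```

Setting `c_q = 0` recovers the standard ETAS model, which is why the two behave identically between injection stages but diverge during pumping.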

    The Predictive Skills of Elastic Coulomb Rate-and-state Aftershock Forecasts During the 2019 Ridgecrest, California, Earthquake Sequence

    Operational earthquake forecasting protocols commonly use statistical models for their recognized ease of implementation and robustness in describing the short‐term spatiotemporal patterns of triggered seismicity. However, recent advances in physics‐based aftershock forecasting reveal performance comparable to the standard statistical counterparts, with significantly improved predictive skills when fault and stress‐field heterogeneities are considered. Here, we perform a pseudoprospective forecasting experiment during the first month of the 2019 Ridgecrest (California) earthquake sequence. We develop seven Coulomb rate‐and‐state models that couple static stress‐change estimates with continuum mechanics expressed by the rate‐and‐state friction laws. Our model parameterization supports gradually increasing complexity; we start from a preliminary implementation with simplified slip distributions and spatially homogeneous receiver faults and reach an enhanced one featuring optimized fault constitutive parameters, finite‐fault slip models, secondary triggering effects, and spatially heterogeneous planes informed by pre‐existing ruptures. The data‐rich environment of southern California allows us to test whether incorporating data collected in near‐real time during an unfolding earthquake sequence boosts our predictive power. We assess the absolute and relative performance of the forecasts by means of statistical tests used within the Collaboratory for the Study of Earthquake Predictability and compare their skills against a standard benchmark epidemic‐type aftershock sequence (ETAS) model for the short term (24 hr after the two Ridgecrest mainshocks) and the intermediate term (one month). Stress‐based forecasts expect heightened rates along the whole near‐fault region and increased seismicity rates on the central Garlock fault.
Our comparative model evaluation not only supports that faulting heterogeneities coupled with secondary triggering effects are the most critical components behind the success of physics‐based forecasts, but also underlines the importance of model updates incorporating near‐real‐time aftershock data, which reach better performance than the standard ETAS model. We explore the physical basis behind our results by investigating the localized shutdown of pre‐existing normal faults in the Ridgecrest near‐source area.
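The relative-performance comparison against ETAS ultimately rests on likelihood scoring of gridded forecasts. A toy sketch of one common summary, the information gain per earthquake under a Poisson approximation; the bin counts are hypothetical and this is a stand-in for, not a reproduction of, the actual CSEP test suite:

```python
import math

def poisson_loglik(forecast, observed):
    """Joint Poisson log-likelihood of observed bin counts given a
    forecast of expected counts per spatial bin."""
    return sum(n * math.log(lam) - lam - math.lgamma(n + 1)
               for lam, n in zip(forecast, observed))

def info_gain_per_eq(forecast_a, forecast_b, observed):
    """Mean information gain per earthquake of model A over model B;
    positive values favour model A."""
    n_total = sum(observed)
    return (poisson_loglik(forecast_a, observed)
            - poisson_loglik(forecast_b, observed)) / n_total

obs  = [3, 0, 1]          # hypothetical observed aftershock counts per bin
crs  = [2.5, 0.2, 0.8]    # hypothetical Coulomb rate-and-state expectations
etas = [1.0, 1.0, 1.0]    # hypothetical spatially uniform ETAS expectations
ig = info_gain_per_eq(crs, etas, obs)
```

A forecast that concentrates its expected counts where events actually occur scores a positive gain over a flatter benchmark, which is the sense in which the enhanced stress-based models "outperform" ETAS above.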

    Tracing the Central Italy 2016-2017 seismic sequence fault system: insights from unsupervised Machine Learning and Principal Component Analysis

    In recent years, we have witnessed the rise of Machine Learning (ML) in popularity and adoption across most scientific disciplines. The reasons behind this success include its versatility in adapting to different problems and types of data sets, the automation of time-consuming repetitive tasks, and its ability to learn complex relationships between observed variables. All of these make ML indispensable to scientific discovery. In seismology, ML has been applied to problems as different as earthquake detection and phase picking, signal classification, ground motion prediction, and early warning system development. In this work, we investigate a rich deep-learning seismic catalogue from the Central Italy 2016-2017 seismic sequence (Tan et al., 2021) with the aim of identifying active faults and studying their distribution and evolution over the duration of the sequence. The catalogue, built using a deep-neural-network-based phase picker, includes over 900 000 earthquakes with moment magnitudes ranging from 0.5 to 6.2, of which 72 000 contain focal mechanism information (p.c. Meier, 2023). For our analysis, we combine unsupervised clustering algorithms such as DBSCAN, HDBSCAN, and OPTICS with Principal Component Analysis (PCA). Our preliminary clustering results for the full, year-long catalogue, as well as for extracted month- and week-long catalogues, with and without focal mechanisms, reveal the presence of high-density clusters of earthquakes of varying extent within a cloud of diffuse seismicity. Through PCA, we associate some of these high-density clusters with individual faults, highlighting the complexity of the fault system and showing how a multitude of faults, often small-scale, became active at different points of the seismic sequence.
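The PCA step that maps a dense hypocentre cluster onto a fault plane can be sketched as follows. The data are synthetic, the density-based clustering step (DBSCAN/HDBSCAN/OPTICS) is elided, and nothing here is the authors' code:

```python
import numpy as np

def fault_plane_from_cluster(xyz):
    """Fit a plane to a cluster of hypocentres via PCA: the two leading
    principal components span the fault plane; the direction of least
    variance is the plane normal."""
    centred = xyz - xyz.mean(axis=0)
    # SVD of the centred cloud: rows of vt are principal directions,
    # ordered by decreasing singular value.
    _, s, vt = np.linalg.svd(centred, full_matrices=False)
    normal = vt[2]                   # least-variance direction
    planarity = 1.0 - s[2] / s[0]    # near 1 for a thin, plane-like cluster
    return normal, planarity

# Synthetic cluster: points scattered on a near-horizontal plane plus noise.
rng = np.random.default_rng(0)
u, v = rng.normal(size=(2, 500))
cloud = np.column_stack([u, v, 0.05 * rng.normal(size=500)])
normal, planarity = fault_plane_from_cluster(cloud)
```

From the recovered normal vector, strike and dip of the candidate fault follow by simple trigonometry, and the planarity score helps separate plane-like clusters from diffuse seismicity.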

    SISMIKO: emergency network deployment and data sharing for the 2016 central Italy seismic sequence

    At 01:36 UTC (03:36 local time) on August 24th 2016, an Mw 6.0 earthquake struck an extensive sector of the central Apennines (coordinates: latitude 42.70° N, longitude 13.23° E, 8.0 km depth). The earthquake caused about 300 casualties and severe damage to historical buildings and economic activity in an area located near the borders of the Umbria, Lazio, Abruzzo, and Marche regions. The Istituto Nazionale di Geofisica e Vulcanologia (INGV) located the hypocenter within a few minutes, near Accumoli, a small town in the province of Rieti. In the hours after the quake, dozens of events were recorded by the National Seismic Network (Rete Sismica Nazionale, RSN) of the INGV, many with ML > 3.0. Thanks to the density and coverage of the RSN in the epicentral area, the epicenter and magnitude of the main event and of the subsequent shocks in the early hours of the seismic sequence were well constrained. However, in order to better constrain the locations of the aftershock hypocenters, especially their depths, a denser seismic monitoring network was needed. Just after the mainshock, SISMIKO, the coordinating body of the emergency seismic network at INGV, was activated to install a temporary seismic network integrated with the existing permanent network in the epicentral area. From August 24th to 30th, SISMIKO deployed eighteen seismic stations, generally six-component (equipped with both velocimeter and accelerometer), with thirteen of the stations transmitting in real time to the INGV seismic monitoring room in Rome. The design and geometry of the temporary network were decided in consultation with other groups deploying seismic stations in the region, namely EMERSITO (a group studying site effects) and the emergency Italian strong-motion network (RAN) managed by the National Civil Protection Department (DPC).
A further 25 broadband (BB) temporary seismic stations were deployed by colleagues from the British Geological Survey (BGS) and the School of Geosciences, University of Edinburgh, in collaboration with INGV. All data acquired from the SISMIKO stations are quickly made available through the European Integrated Data Archive (EIDA). The data acquired by the SISMIKO stations were included in the preliminary analysis performed by the Bollettino Sismico Italiano (BSI), the Centro Nazionale Terremoti (CNT) staff working in Ancona, and the INGV-MI, described below.

    Testing earthquake links in Mexico from 1978 to the 2017 M = 8.1 Chiapas and M = 7.1 Puebla Shocks

    The M = 8.1 Chiapas and M = 7.1 Puebla earthquakes occurred in the bending part of the subducting Cocos plate, 11 days and ~600 km apart, a separation that puts them well outside the typical aftershock zone. We find this to be a relatively common occurrence in Mexico, with 14% of M > 7.0 earthquakes since 1900 striking more than 300 km apart and within a 2-week interval, a rate not different from that of a randomized catalog. We calculate the triggering potential caused by crustal stress redistribution from large subduction earthquakes over the last 40 years. There is no evidence that static stress transfer or dynamic triggering from the 8 September Chiapas earthquake promoted the 19 September earthquake. Both recent earthquakes were instead promoted by past thrust events, including delayed afterslip from the 2012 M = 7.5 Oaxaca earthquake. A repeated pattern of shallow thrust events promoting deep intraslab earthquakes is observed over the past 40 years.
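The comparison against a randomized catalog can be illustrated with a toy Monte Carlo null test. The event times, catalog span, and trial count below are hypothetical, and the actual study also conditions on inter-event distance and magnitude:

```python
import random

def close_in_time_pairs(times, window=14.0):
    """Count event pairs separated by less than `window` days."""
    return sum(1 for i in range(len(times)) for j in range(i + 1, len(times))
               if abs(times[i] - times[j]) < window)

def randomized_test(times, span_days, n_trials=2000, seed=1):
    """Fraction of randomized catalogs (uniform event times over the same
    span) with at least as many close-in-time pairs as observed."""
    rng = random.Random(seed)
    observed = close_in_time_pairs(times)
    hits = sum(
        close_in_time_pairs([rng.uniform(0.0, span_days) for _ in times])
        >= observed
        for _ in range(n_trials))
    return hits / n_trials

# Two of four hypothetical M > 7 events fall 11 days apart, as in 2017.
p = randomized_test([0.0, 11.0, 500.0, 2000.0], 3650.0)
```

A large fraction means random timing reproduces the observed clustering, which is the sense in which the 14% pairing rate is "not different from a randomized catalog".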

    Crustal permeability changes inferred from seismic attenuation: Impacts on multi-mainshock sequences

    We use amplitude ratios from narrowband-filtered earthquake seismograms to measure variations of seismic attenuation over time, providing unique insights into the dynamic state of stress in the Earth’s crust at depth. Our dataset from earthquakes of the 2016–2017 Central Apennines sequence allows us to obtain high-resolution time histories of seismic attenuation (frequency band: 0.5–30 Hz) characterized by strong earthquake-dilatation-induced fluctuations at seismogenic depths, caused by the cumulative elastic stress drop over the sequence, as well as damage-induced fluctuations at shallow depths caused by energetic surface waves. The cumulative stress drop causes negative dilatation and hence reduced permeability and seismic attenuation, whereas strong-motion surface waves produce an increase in crack density, and thus in permeability and seismic attenuation. In the aftermath of the main shocks of the sequence, we show that the occurrence of M ≄ 3.5 earthquakes in time and distance is consistent with fluid diffusion: diffusion signatures are associated with changes in seismic attenuation during the first days of the Amatrice, Visso-Norcia, and Capitignano sub-sequences. We hypothesize that coseismic permeability changes create fluid diffusion pathways that are at least partly responsible for triggering multi-mainshock seismic sequences. Here we show that anelastic seismic attenuation fluctuates coherently with this hypothesis.
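The diffusion-consistency check compares each aftershock's distance and delay against a pore-pressure triggering front. A minimal sketch using the commonly used envelope r = sqrt(4πDt); the hydraulic diffusivity value is an assumption and the paper's exact functional form may differ:

```python
import math

def triggering_front_km(t_days, diffusivity=1.0):
    """Radius (km) of the pore-pressure triggering front r = sqrt(4*pi*D*t),
    with D an assumed hydraulic diffusivity in m^2/s."""
    t_s = t_days * 86400.0                      # days -> seconds
    return math.sqrt(4.0 * math.pi * diffusivity * t_s) / 1000.0

def consistent_with_diffusion(r_km, t_days, diffusivity=1.0):
    """True if an event at distance r_km and delay t_days lies inside
    the diffusion front."""
    return r_km <= triggering_front_km(t_days, diffusivity)
```

Plotting events in distance-time coordinates and checking how many fall under this parabolic envelope is the standard way such diffusion signatures are identified.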

    A comprehensive suite of earthquake catalogues for the 2016-2017 Central Italy seismic sequence

    The protracted nature of the 2016-2017 central Italy seismic sequence, with multiple damaging earthquakes spaced over months, presented serious challenges for the duty seismologists and emergency managers as they assimilated the growing sequence to advise the local population. Uncertainty concerning where and when it was safe to occupy vulnerable structures highlighted the need for timely delivery of scientifically based understanding of the evolving hazard and risk. Seismic hazard assessment during complex sequences depends critically on up-to-date earthquake catalogues—i.e., data on locations, magnitudes, and activity of earthquakes—to characterize the ongoing seismicity and fuel earthquake forecasting models. Here we document six earthquake catalogues of this sequence that were developed using a variety of methods. The catalogues possess different levels of resolution and completeness resulting from progressive enhancements in data availability, detection sensitivity, and hypocentral location accuracy. The catalogues range from real-time products to ones built with advanced machine-learning procedures, and they highlight both the promise and the challenges of implementing advanced workflows in an operational environment.
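Catalogue completeness is commonly summarized through the Gutenberg-Richter b-value above the magnitude of completeness. A minimal sketch of the Aki (1965) maximum-likelihood estimator as one such diagnostic; the abstract does not state which estimator was used, and the synthetic magnitudes below are illustrative:

```python
import math
import random

def b_value(mags, mc, dm=0.1):
    """Aki (1965) maximum-likelihood b-value for magnitudes at or above
    the completeness level mc, with a half-bin correction dm for binned
    magnitudes (set dm=0 for continuous values)."""
    above = [m for m in mags if m >= mc]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - (mc - dm / 2.0))

# Synthetic Gutenberg-Richter magnitudes with true b = 1 above M 1.0:
# exceedances above mc are exponential with rate b*ln(10).
rng = random.Random(42)
mags = [1.0 + rng.expovariate(math.log(10)) for _ in range(20000)]
b = b_value(mags, mc=1.0, dm=0.0)
```

Comparing b-value stability as mc is lowered is one simple way to compare the completeness of the six catalogues against one another.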