    Inconsistencies and Lurking Pitfalls in the Magnitude–Frequency Distribution of High-Resolution Earthquake Catalogs

    Earthquake catalogs describe the distribution of earthquakes in space, time, and magnitude, which is essential information for earthquake forecasting and the assessment of seismic hazard and risk. Available high‐resolution (HR) catalogs raise the expectation that their abundance of small earthquakes will help better characterize the fundamental scaling laws of statistical seismology. Here, we investigate whether the ubiquitous exponential‐like scaling relation for magnitudes (Gutenberg–Richter [GR], or its tapered version) can be straightforwardly extrapolated to the magnitude–frequency distribution (MFD) of HR catalogs. For several HR catalogs, such as those of the 2019 Ridgecrest sequence, the 2009 L’Aquila sequence, the 1992 Landers sequence, and entire southern California, we determine whether the MFD agrees with an exponential‐like distribution using a statistical goodness‐of‐fit test. We find that HR catalogs usually do not preserve the exponential‐like MFD toward low magnitudes and depart from it. Surprisingly, HR catalogs that are based on advanced detection methods depart from an exponential‐like MFD at a similar magnitude level as network‐based HR catalogs. These departures are mostly due to an improper mixing of different magnitude types, spatiotemporally inhomogeneous completeness, or biased data recording or processing. Remarkably, common‐practice methods to find the completeness magnitude do not recognize these departures and lead to severe bias in the b‐value estimation. We conclude that extrapolating the exponential‐like GR relation to lower magnitudes cannot be taken for granted, and that HR catalogs pose subtle new challenges and lurking pitfalls that may hamper their proper use. The simplest solution to preserve the exponential‐like distribution toward low magnitudes may be to estimate a moment magnitude for each earthquake. This study was supported by the 'Real-time Earthquake Risk Reduction for a Resilient Europe' (RISE) project, funded by the European Union's Horizon 2020 research and innovation program (Grant Agreement No. 821115).
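
    The goodness-of-fit check described above can be illustrated with a minimal sketch (not the authors' actual procedure): estimate the b-value by maximum likelihood above an assumed completeness magnitude and compare the observed magnitudes with the fitted exponential using a simulated Kolmogorov-Smirnov-type test. The magnitudes, the cutoff mc, and the bin width dm below are illustrative assumptions.

import numpy as np

def gr_goodness_of_fit(mags, mc=1.0, dm=0.1, n_sim=1000, seed=0):
    """Fit a GR b-value by maximum likelihood above mc and check the
    exponential fit with a simulated (Lilliefors-style) KS-type test."""
    rng = np.random.default_rng(seed)
    m = np.asarray(mags, dtype=float)
    m = m[m >= mc]
    # Aki/Utsu maximum-likelihood estimator with binning correction
    beta = 1.0 / (m.mean() - (mc - dm / 2.0))        # beta = b * ln(10)
    b = beta / np.log(10.0)

    def ks_stat(sample, beta_hat):
        x = np.sort(sample) - mc
        model_cdf = 1.0 - np.exp(-beta_hat * x)      # exponential CDF
        emp_cdf = np.arange(1, x.size + 1) / x.size
        return np.max(np.abs(emp_cdf - model_cdf))   # simplified KS distance

    d_obs = ks_stat(m, beta)
    # Null distribution of the statistic, re-estimating beta for each replica
    d_sim = np.empty(n_sim)
    for i in range(n_sim):
        s = mc + rng.exponential(1.0 / beta, size=m.size)
        d_sim[i] = ks_stat(s, 1.0 / (s.mean() - mc))
    p_value = float(np.mean(d_sim >= d_obs))
    return b, p_value

# Purely synthetic demo catalogue with b = 1, binned to 0.1 magnitude units
demo = np.round(1.0 + np.random.default_rng(1).exponential(1.0 / np.log(10.0), 5000), 1)
print(gr_goodness_of_fit(demo))

    A very small p_value would flag the kind of low-magnitude departure from the exponential-like MFD discussed above.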

    BET_VH: a probabilistic tool for long-term volcanic hazard assessment

    In this paper, we illustrate a Bayesian Event Tree to estimate Volcanic Hazard (BET_VH). The procedure enables us to calculate the probability of any kind of long-term hazardous event in which we are interested, accounting for the intrinsic stochastic nature of volcanic eruptions and our limited knowledge regarding related processes. For the input, the code incorporates results from numerical models simulating the impact of hazardous volcanic phenomena on an area and data from the eruptive history. For the output, the code provides a wide and exhaustive set of spatiotemporal probabilities of different events; these probabilities are estimated by means of a Bayesian approach that allows all uncertainties to be properly accounted for. The code is able to deal with many eruptive settings simultaneously, weighting each with its own probability of occurrence. In a companion paper, we give a detailed example of application of this tool to the Campi Flegrei caldera, in order to estimate the hazard from tephra fall. © The Author(s) 2010
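
    As an illustration only of how an event-tree calculation combines conditional probabilities along a branch (the actual BET_VH has more nodes and a fuller Bayesian treatment of aleatory and epistemic uncertainty), here is a minimal sketch; the node names, counts, and priors are invented for the example.

import numpy as np

# Minimal event-tree sketch: the long-term probability of a hazardous outcome
# is the product of conditional probabilities along one branch of the tree.
rng = np.random.default_rng(1)

# Epistemic uncertainty at each node is represented by a Beta distribution
# (k = past occurrences, n = observed opportunities); values are invented.
nodes = {
    "eruption_given_unrest": (4, 10),   # 4 eruptions in 10 unrest episodes
    "vent_in_sector":        (2, 4),    # 2 of 4 eruptions from this sector
    "tephra_reaches_area":   (3, 6),    # e.g. from numerical simulations
}

n_samples = 100_000
p_branch = np.ones(n_samples)
for name, (k, n) in nodes.items():
    # Beta(k+1, n-k+1) is a uniform prior updated with k successes in n trials
    p_branch *= rng.beta(k + 1, n - k + 1, size=n_samples)

print(f"mean probability: {p_branch.mean():.3f}")
print(f"10th-90th percentile: {np.percentile(p_branch, [10, 90])}")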

    3-D spatial cluster analysis of seismic sequences through density-based algorithms

    With seismic catalogues becoming progressively larger, extracting information becomes challenging and calls for sophisticated statistical analysis. Data are typically clustered by machine learning algorithms to find patterns or identify regions of interest that require further exploration. Here, we investigate two density-based clustering algorithms, DBSCAN and OPTICS, for their capability to analyse the spatial distribution of seismicity and their effectiveness in discovering highly active seismic volumes of arbitrary shapes in large data sets. In particular, we study the influence of varying input parameters on the cluster solutions. By exploring the parameter space, we identify a crossover region with optimal solutions in between two phases with opposite behaviours (i.e. only clustered and only unclustered data points). Using a synthetic case with various geometric structures, we find that solutions in the crossover region consistently have the largest clusters and best represent the individual structures. For identifying strongly anisotropic structures, we illustrate the usefulness of data rescaling. Applying the clustering algorithms to seismic catalogues of recent earthquake sequences (2016 Central Italy and 2016 Kumamoto) confirms that cluster solutions in the crossover region are the best candidates to identify 3-D features of tectonic structures that were activated in a seismic sequence. Finally, we propose a list of recipes that generalizes our analyses to obtain such solutions for other seismic sequences.
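
    A minimal sketch of this kind of analysis with scikit-learn's DBSCAN and OPTICS on synthetic 3-D hypocentres is given below; the coordinates, the axis rescaling factor, and the parameter values (eps, min_samples, xi) are illustrative assumptions, not the ones explored in the paper.

import numpy as np
from sklearn.cluster import DBSCAN, OPTICS

# Synthetic 3-D hypocentres: two planar structures plus background noise
# (coordinates in km; values are illustrative only).
rng = np.random.default_rng(0)
plane1 = np.column_stack([rng.uniform(0, 10, 300),
                          rng.normal(2, 0.2, 300),
                          rng.uniform(2, 8, 300)])
plane2 = np.column_stack([rng.normal(15, 0.2, 300),
                          rng.uniform(0, 10, 300),
                          rng.uniform(4, 10, 300)])
noise = rng.uniform([0, 0, 0], [20, 12, 12], size=(200, 3))
xyz = np.vstack([plane1, plane2, noise])

# Optional rescaling of one axis to enhance strongly anisotropic structures
xyz_scaled = xyz * np.array([1.0, 1.0, 0.5])

# eps / min_samples land in a sensible region only by construction here;
# for a real catalogue the parameter space has to be explored systematically.
db = DBSCAN(eps=0.8, min_samples=10).fit(xyz_scaled)
op = OPTICS(min_samples=10, xi=0.05).fit(xyz_scaled)

for name, labels in [("DBSCAN", db.labels_), ("OPTICS", op.labels_)]:
    n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
    n_noise = int(np.sum(labels == -1))
    print(f"{name}: {n_clusters} clusters, {n_noise} unclustered events")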

    Revealing the spatiotemporal complexity of the magnitude distribution and b-value during an earthquake sequence

    The magnitude–frequency distribution (MFD) of earthquakes is typically modeled with the (tapered) Gutenberg–Richter relation. The main parameter of this relation, the b-value, controls the relative rate of small and large earthquakes. Resolving spatiotemporal variations of the b-value is critical to understanding the earthquake occurrence process and improving earthquake forecasting. However, this variation is not well understood. Here we present remarkable MFD variability during the complex 2016/17 central Italy sequence using a high-resolution earthquake catalog. Isolating seismically active volumes (‘clusters’) reveals that the MFD differed in nearby clusters, varied or remained constant in time depending on the cluster, and increased in b-value in the cluster where the largest earthquake eventually occurred. These findings suggest that the fault system’s heterogeneity and complexity influence the MFD. Our findings raise the question “b-value of what?”: interpreting and using MFD variability needs a spatiotemporal scale that is physically meaningful, like the one proposed here.
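
    A minimal sketch of how b-values (with first-order uncertainties) can be compared between two seismically active volumes, using the standard Aki/Utsu maximum-likelihood estimator on binned magnitudes; the cluster labels, magnitudes, and completeness magnitude are synthetic and purely illustrative, not the clusters analysed in this study.

import numpy as np

def b_value(mags, mc, dm=0.1):
    """Aki/Utsu maximum-likelihood b-value on binned magnitudes, with the
    first-order standard error b / sqrt(n)."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= mc]
    b = 1.0 / (np.log(10.0) * (m.mean() - (mc - dm / 2.0)))
    return b, b / np.sqrt(m.size)

# Synthetic magnitudes for two hypothetical clusters (true b = 0.8 and 1.2),
# binned to 0.1 units above a completeness magnitude of 1.0.
rng = np.random.default_rng(2)
clusters = {
    "cluster A": np.round(1.0 + rng.exponential(1.0 / (0.8 * np.log(10.0)), 2000), 1),
    "cluster B": np.round(1.0 + rng.exponential(1.0 / (1.2 * np.log(10.0)), 2000), 1),
}
for name, mags in clusters.items():
    b, err = b_value(mags, mc=1.0)
    print(f"{name}: b = {b:.2f} +/- {err:.2f}")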

    Retrospective Evaluation of the Five-Year and Ten-Year CSEP-Italy Earthquake Forecasts

    On 1 August 2009, the global Collaboratory for the Study of Earthquake Predictability (CSEP) launched a prospective and comparative earthquake predictability experiment in Italy. The goal of the CSEP-Italy experiment is to test earthquake occurrence hypotheses that have been formalized as probabilistic earthquake forecasts over temporal scales that range from days to years. In the first round of forecast submissions, members of the CSEP-Italy Working Group presented eighteen five-year and ten-year earthquake forecasts to the European CSEP Testing Center at ETH Zurich. We considered the twelve time-independent earthquake forecasts among this set and evaluated them with respect to past seismicity data from two Italian earthquake catalogs. In this article, we present the results of tests that measure the consistency of the forecasts with the past observations. Besides being an evaluation of the submitted time-independent forecasts, this exercise provided insight into a number of important issues in predictability experiments with regard to the specification of the forecasts, the performance of the tests, and the trade-off between the robustness of results and experiment duration. We conclude with suggestions for the future design of earthquake predictability experiments. Comment: 43 pages, 8 figures, 4 tables
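
    One standard CSEP-style consistency check is the Poisson N-test, which compares the observed number of target earthquakes with the forecast's expected count; the sketch below is a generic illustration with invented numbers, not necessarily the exact test suite used in this study.

from scipy.stats import poisson

def n_test(n_forecast, n_observed):
    """Poisson N-test: two one-sided tail probabilities of the observed
    event count given the forecast's expected count."""
    delta1 = 1.0 - poisson.cdf(n_observed - 1, n_forecast)  # P(N >= n_obs)
    delta2 = poisson.cdf(n_observed, n_forecast)            # P(N <= n_obs)
    return delta1, delta2

# Illustrative numbers only: a forecast expecting 20.0 events above the
# target magnitude over the testing period versus 31 observed.
d1, d2 = n_test(20.0, 31)
print(f"P(N >= 31 | 20 expected) = {d1:.3f}")
print(f"P(N <= 31 | 20 expected) = {d2:.3f}")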

    A technical note on the bias in the estimation of the b-value and its uncertainty through the Least Squares technique

    We investigate conceptually, analytically, and numerically the biases in the estimation of the b-value of the Gutenberg-Richter Law and of its uncertainty made through the least squares technique. The biases are introduced by the cumulation operation for the cumulative form of the Gutenberg-Richter Law, by the logarithmic transformation, and by the measurement errors on the magnitude. We find that the least squares technique, applied to the cumulative and binned form of the Gutenberg-Richter Law, produces strong bias in the b-value and its uncertainty, whose amplitudes depend on the size of the sample. Furthermore, the logarithmic transformation produces two different endemic bends in the Log(N) versus M curve. This means that this plot might show spurious significant departures from the Gutenberg-Richter Law. The effect of the measurement errors is negligible compared to those of the cumulation operation and the logarithmic transformation. The results obtained show that the least squares technique should never be used to determine the slope of the Gutenberg-Richter Law and its uncertainty.
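
    The effect described above can be explored with a small Monte Carlo sketch: generate synthetic Gutenberg-Richter catalogs, then estimate b both by least squares on the cumulative log10 N(>=M) curve and by maximum likelihood; the catalog size, true b-value, and binning below are illustrative assumptions, not the settings used by the authors.

import numpy as np

rng = np.random.default_rng(3)
b_true, mc, dm, n_events, n_trials = 1.0, 1.0, 0.1, 500, 1000
beta = b_true * np.log(10.0)

b_ls, b_ml = [], []
for _ in range(n_trials):
    m = mc + rng.exponential(1.0 / beta, n_events)
    m = np.round(m / dm) * dm                         # bin magnitudes to 0.1
    # Least squares on the cumulative form log10 N(>=M) versus M
    edges = np.arange(mc, m.max() + dm, dm)
    n_cum = np.array([(m >= e - dm / 2).sum() for e in edges])
    keep = n_cum > 0
    slope, _ = np.polyfit(edges[keep], np.log10(n_cum[keep]), 1)
    b_ls.append(-slope)
    # Maximum likelihood (Aki/Utsu with binning correction)
    b_ml.append(1.0 / (np.log(10.0) * (m.mean() - (mc - dm / 2))))

print(f"true b = {b_true}")
print(f"least squares : mean {np.mean(b_ls):.2f}, std {np.std(b_ls):.2f}")
print(f"max likelihood: mean {np.mean(b_ml):.2f}, std {np.std(b_ml):.2f}")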

    (4) a swarm at Strandline Lake, which continued for several years (1996-1999); and (5) deformation of Mt. Peulik (inflation beginning after October 1996)

    In the fall of 1996 the Alaska Volcano Observatory recorded an unprecedented level of seismic and volcanic activity. The following were observed: (1) a swarm at Iliamna Volcano (August 1996 to mid-1997); (2) an eruption at Pavlof Volcano (September 1996 to December 1996); (3) a swarm at Martin/Mageik volcanoes (October 1996); (4) a swarm at Strandline Lake, which continued for several years (1996-1999); and (5) deformation of Mt. Peulik (inflation beginning after October 1996 and ending in fall 1998), based on interpretation of interferometric synthetic aperture radar data. The number of monitored volcanic areas in 1996 was thirteen. We conducted two formal statistical tests to determine the likelihood of four of these events occurring randomly in the same time interval. The tests are based on different conceptual probabilistic models (classical and Bayesian) that embrace a wide range of realistic tectonic models. The first test considered only the areas in which swarms or eruptions occurred (7 of 13 if Strandline Lake is included; 6 of 12 otherwise), by comparing the real catalog with 10,000 synthetic catalogs under the assumption that the sites are independent. The second method is a hierarchical Bayesian model in which the probability of a swarm at each of the 13 (or 12) areas is different but the parent population is the same. We found that the likelihood of the swarms and eruption occurring nearly simultaneously by chance alone is small for a wide range of probabilistic schemes and, consequently, for different tectonic scenarios. Therefore, we conclude that the events may be related to a single process. We speculate that a widespread deformation pulse or strain transient, occurring mainly in the eastern half of the arc, may be the most likely candidate for causing such nearly simultaneous events.
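
    The logic of the first test can be illustrated with a minimal sketch: if each monitored area had some independent probability of hosting a swarm or eruption in the same time window, the chance that four or more areas are active together follows from a binomial calculation or, equivalently, from synthetic catalogs; the per-area probability used below is an assumption for illustration only.

import numpy as np
from scipy.stats import binom

n_areas, k_active = 13, 4    # 13 monitored areas, 4 nearly simultaneous events
p_area = 0.05                # assumed per-area probability in the time window

# Closed form: probability that 4 or more independent areas are active
p_binom = 1.0 - binom.cdf(k_active - 1, n_areas, p_area)

# Monte Carlo with synthetic catalogs, mirroring the 10,000-catalog idea
rng = np.random.default_rng(4)
active = rng.random((10_000, n_areas)) < p_area
p_mc = np.mean(active.sum(axis=1) >= k_active)

print(f"binomial:    P(>= {k_active} active) = {p_binom:.4f}")
print(f"Monte Carlo: P(>= {k_active} active) = {p_mc:.4f}")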

    Improved Earthquake Forecasting Model Based on Long-term Memory in Earthquakes

    A prominent feature of earthquakes is their empirical laws, including memory (clustering) in time and space. Several earthquake forecasting models, like the Epidemic-Type Aftershock Sequence (ETAS) model, were developed based on these empirical laws. Yet, a recent study showed that the ETAS model fails to reproduce the significant long-term memory characteristics found in real earthquake catalogs. Here we modify and generalize the ETAS model to include short- and long-term triggering mechanisms, to account for the short- and long-time memory (exponents) recently discovered in the data. Our generalized ETAS model accurately reproduces the short- and long-term/distance memory observed in the Italian and Southern California earthquake catalogs. The revised ETAS model is also found to significantly improve earthquake forecasting.
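
    For reference, a minimal sketch of the standard temporal ETAS conditional intensity (the unmodified model, not the authors' generalized long-memory version) is given below; all parameter values and the tiny catalogue are illustrative.

import numpy as np

def etas_intensity(t, event_times, event_mags, mu=0.2, K=0.02,
                   alpha=1.0, c=0.01, p=1.1, m0=3.0):
    """Standard temporal ETAS conditional intensity (events/day):
    lambda(t) = mu + sum_i K * 10**(alpha*(M_i - m0)) / (t - t_i + c)**p,
    summed over all events with t_i < t. Parameter values are illustrative."""
    past = event_times < t
    dt = t - event_times[past]
    trig = K * 10.0 ** (alpha * (event_mags[past] - m0)) / (dt + c) ** p
    return mu + trig.sum()

# Tiny illustrative catalogue: occurrence times in days, magnitudes above m0 = 3.0
times = np.array([0.0, 0.3, 1.2, 5.0])
mags = np.array([5.5, 3.4, 4.0, 3.2])
for t in [0.5, 2.0, 10.0]:
    print(f"lambda({t:>4}) = {etas_intensity(t, times, mags):.3f} events/day")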