
    Are megaquakes clustered?

    We study statistical properties of the number of large earthquakes over the past century. We analyze the cumulative distribution of the number of earthquakes with magnitude larger than a threshold M in a time interval T, and quantify the statistical significance of the results by simulating a large number of synthetic random catalogs. We find that, in general, the earthquake record cannot be distinguished from a process that is random in time. This conclusion holds whether aftershocks are removed or not, except at magnitudes below M = 7.3. At long time intervals (T = 2-5 years), we find that statistically significant clustering is present in the catalog for lower magnitude thresholds (M = 7-7.2). However, this clustering is due to a large number of earthquakes on record in the early part of the 20th century, when magnitudes are less certain. Comment: 5 pages, 5 figures
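    A minimal sketch of this kind of significance test, assuming a simple sliding-window count statistic: events above a magnitude threshold M are counted in windows of length T and compared against many synthetic catalogs with the same number of events placed uniformly at random in time. The catalog arrays, the window statistic, and all parameter values below are illustrative placeholders, not the authors' data or code.

```python
# Minimal sketch (NOT the authors' code; placeholder catalog): compare the
# maximum number of M >= threshold events in any window of length T against
# the same statistic in synthetic catalogs that are random in time.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical catalog: occurrence times (decimal years) and magnitudes
times = np.sort(rng.uniform(1900.0, 2012.0, size=300))
mags = rng.uniform(7.0, 9.0, size=300)

def max_count_in_window(t, T):
    """Largest number of events falling in any window of length T."""
    t = np.sort(t)
    counts = np.searchsorted(t, t + T, side="right") - np.arange(len(t))
    return int(counts.max()) if len(t) else 0

M, T = 7.5, 5.0                               # magnitude threshold, window (yr)
sel = times[mags >= M]
observed = max_count_in_window(sel, T)

# Synthetic random catalogs: same number of events, uniform in time
n_sims, span = 10000, (1900.0, 2012.0)
synthetic = np.array([
    max_count_in_window(rng.uniform(*span, size=len(sel)), T)
    for _ in range(n_sims)
])

# Fraction of random catalogs that cluster at least as strongly
p_value = np.mean(synthetic >= observed)
print(f"observed max count = {observed}, p = {p_value:.3f}")
```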

    Harabe

    Vecihi's novel Harabe, serialized in İkdam

    Power Laws, Precursors and Predictability During Failure

    We investigate the dynamics of a modified Burridge-Knopoff model by introducing a dissipative term to mimic the bursts of acoustic emission (AE) from rock samples. The model explains many features of the statistics of AE signals observed in experiments, such as the crossover in the exponent value from the regime of relatively small-amplitude AE signals to that of larger ones, and their dependence on the pulling speed. Significantly, we find that the cumulative dissipated energy, identified with acoustic emission, can be used to predict a major slip event. We also find a data collapse of the acoustic activity for several major slip events, describable by a universal stretched exponential with corrections in terms of time-to-failure. Comment: 7 pages, 6 figures, final version with minor changes
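    The predictive use of cumulative dissipated energy suggests a time-to-failure fit. A minimal sketch, assuming a stretched-exponential form E(t) ~ A exp(-((t_f - t)/tau)**beta) rather than the paper's exact expression; the synthetic AE record, parameter values, and the treatment of the failure time t_f as a free parameter are all assumptions made for illustration.

```python
# Minimal sketch (assumed stretched-exponential form, not necessarily the
# paper's exact expression): fit cumulative AE energy approaching a slip
# event at time t_f. Data below are synthetic placeholders.
import numpy as np
from scipy.optimize import curve_fit

def stretched_exp(t, A, tau, beta, t_f):
    dt = np.clip(t_f - t, 1e-9, None)       # guard: keep time-to-failure > 0
    return A * np.exp(-(dt / tau) ** beta)

# Synthetic "cumulative AE energy" approaching failure at t_f = 100
rng = np.random.default_rng(1)
t = np.linspace(0.0, 99.0, 400)
truth = stretched_exp(t, 1.0, 30.0, 0.6, 100.0)
energy = truth * rng.lognormal(0.0, 0.05, size=t.size)

# Fit with the failure time t_f as a free parameter (a crude predictor)
p0 = (1.0, 20.0, 0.5, 105.0)
bounds = ([0.0, 1.0, 0.1, 99.5], [10.0, 200.0, 2.0, 200.0])
popt, _ = curve_fit(stretched_exp, t, energy, p0=p0, bounds=bounds)
A_fit, tau_fit, beta_fit, tf_fit = popt
print(f"estimated failure time t_f ~ {tf_fit:.1f}, beta ~ {beta_fit:.2f}")
```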

    Temporal changes in rock uplift rates of folds in the foreland of the Tian Shan and the Pamir from geodetic and geologic data

    Understanding the evolution of continental deformation zones relies on quantifying spatial and temporal changes in deformation rates of tectonic structures. Along the eastern boundary of the Pamir‐Tian Shan collision zone, we constrain secular variations of rock uplift rates for a series of five Quaternary detachment‐ and fault‐related folds from their initiation to the modern day. When combined with GPS data, decomposition of interferometric synthetic aperture radar time series constrains the spatial pattern of surface and rock uplift on the folds deforming at decadal rates of 1–5 mm/yr. These data confirm the previously proposed basinward propagation of structures during the Quaternary. By fitting our geodetic rates and previously published geologic uplift rates with piecewise linear functions, we find that gradual rate changes over >100 kyr can explain the interferometric synthetic aperture radar observations where changes in average uplift rates are greater than ~1 mm/yr among different time intervals (~10¹, 10⁴–10⁵, and 10⁵–10⁶ years).
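    A minimal sketch of piecewise-linear rate fitting of the kind described above, assuming a single breakpoint and a continuous two-segment uplift history; the marker ages, uplift amounts, and starting guesses are hypothetical placeholders, not the published data.

```python
# Minimal sketch (hypothetical marker ages and uplift amounts): fit cumulative
# rock uplift since fold initiation with a continuous two-segment
# piecewise-linear history, so average rates can differ between intervals.
import numpy as np
from scipy.optimize import curve_fit

def piecewise_uplift(t, t_break, rate1, rate2):
    """Cumulative uplift (m): rate1 (m/kyr) before t_break (kyr), rate2 after."""
    return np.where(t < t_break,
                    rate1 * t,
                    rate1 * t_break + rate2 * (t - t_break))

# Placeholder observations: dated markers (kyr since initiation) and uplift (m)
t_obs = np.array([20.0, 60.0, 120.0, 200.0, 350.0, 500.0])
u_obs = np.array([40.0, 115.0, 250.0, 400.0, 620.0, 840.0])

popt, _ = curve_fit(piecewise_uplift, t_obs, u_obs, p0=(150.0, 2.0, 1.5))
t_b, r1, r2 = popt
print(f"breakpoint ~{t_b:.0f} kyr, rates {r1:.2f} -> {r2:.2f} m/kyr (= mm/yr)")
```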

    Properties of Foreshocks and Aftershocks of the Non-Conservative SOC Olami-Feder-Christensen Model: Triggered or Critical Earthquakes?

    Following Hergarten and Neugebauer [2002], who discovered aftershock and foreshock sequences in the Olami-Feder-Christensen (OFC) discrete block-spring earthquake model, we investigate to what degree the simple toppling mechanism of this model is sufficient to account for the properties of earthquake clustering in time and space. Our main finding is that synthetic catalogs generated by the OFC model share practically all qualitative properties of real seismicity, albeit with significant quantitative differences. We find that OFC catalogs can in large part be described by the concept of triggered seismicity, but the properties of foreshocks depend on the mainshock magnitude, in qualitative agreement with the critical earthquake model and in disagreement with simple models of triggered seismicity such as the Epidemic Type Aftershock Sequence (ETAS) model [Ogata, 1988]. Many other features of OFC catalogs can be reproduced with the ETAS model, but with weaker clustering than real seismicity, i.e. with a very small average number of first-generation triggered earthquakes per mother earthquake. Comment: revtex, 19 pages, 8 eps figures
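    A minimal sketch of the OFC toppling rule on a small open-boundary lattice, for readers unfamiliar with the model; the lattice size, dissipation parameter alpha, and run length are illustrative choices, not those used in the study.

```python
# Minimal sketch of an Olami-Feder-Christensen (OFC) cellular automaton on a
# small open-boundary lattice; parameters and sizes are illustrative only.
import numpy as np

L, alpha, threshold = 32, 0.2, 1.0          # 4*alpha < 1 => non-conservative
rng = np.random.default_rng(2)
stress = rng.uniform(0, threshold, size=(L, L))
catalog = []                                # "earthquake" sizes (topplings)

for step in range(3000):
    # Uniform drive: raise all sites so the most-loaded one reaches threshold
    stress += threshold - stress.max()

    size = 0
    unstable = np.argwhere(stress >= threshold)
    while unstable.size:
        for i, j in unstable:
            s = stress[i, j]
            stress[i, j] = 0.0              # toppled site resets to zero
            size += 1
            # Give a fraction alpha of the stress to each neighbour; stress
            # crossing the open boundary is lost (dissipation).
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < L and 0 <= nj < L:
                    stress[ni, nj] += alpha * s
        unstable = np.argwhere(stress >= threshold)
    catalog.append(size)

print("events recorded:", len(catalog), "largest avalanche:", max(catalog))
```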

    On the Occurrence of Finite-Time-Singularities in Epidemic Models of Rupture, Earthquakes and Starquakes

    We present a new kind of critical stochastic finite-time singularity, relying on the interplay between long memory and extreme fluctuations. We illustrate it on the well-established epidemic-type aftershock (ETAS) model for aftershocks, based solely on the most solidly documented stylized facts of seismicity (clustering in space and time, and the power-law Gutenberg-Richter distribution of earthquake energies). This theory accounts for the main observations (power-law acceleration and discrete scale-invariant structure) of critical rupture of heterogeneous materials, of the largest sequence of starquakes ever attributed to a neutron star, as well as of earthquake sequences. Comment: Revtex document of 4 pages including 1 eps figure
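    A minimal sketch of an ETAS-type branching simulation in the spirit of Ogata [1988]: background events occur as a stationary Poisson process, and each event triggers offspring with Gutenberg-Richter magnitudes and Omori-Utsu waiting times. All parameter values are illustrative and uncalibrated, and the subcritical branching ratio used here does not produce the finite-time singularity discussed above.

```python
# Minimal sketch of an ETAS-type branching simulation; parameter values are
# illustrative, not calibrated to any catalog.
import numpy as np

rng = np.random.default_rng(3)
mu, b, m0 = 0.2, 1.0, 4.0                 # background rate/day, b-value, cutoff
K, alpha, c, p = 0.02, 0.8, 0.01, 1.2     # productivity and Omori parameters
T_end = 1000.0                            # days

def sample_magnitude(n):
    # Gutenberg-Richter law: exponential in magnitude above the cutoff m0
    return m0 + rng.exponential(1.0 / (b * np.log(10.0)), size=n)

# Background events (stationary Poisson process)
n_bg = rng.poisson(mu * T_end)
events = [(t, m) for t, m in zip(rng.uniform(0, T_end, n_bg),
                                 sample_magnitude(n_bg))]

# Trigger offspring generation by generation
queue = list(events)
while queue:
    t_par, m_par = queue.pop()
    n_kids = rng.poisson(K * 10.0 ** (alpha * (m_par - m0)))
    if n_kids == 0:
        continue
    # Omori-Utsu waiting times, density ~ (t + c)**(-p), sampled by inversion
    u = rng.uniform(size=n_kids)
    dt = c * ((1.0 - u) ** (1.0 / (1.0 - p)) - 1.0)
    for t_child, m_child in zip(t_par + dt, sample_magnitude(n_kids)):
        if t_child < T_end:
            child = (t_child, m_child)
            events.append(child)
            queue.append(child)

print(f"{n_bg} background events, {len(events)} total after triggering")
```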

    The critical earthquake concept applied to mine rockbursts with time-to-failure analysis

    We report new tests of the critical earthquake concept performed on rockbursts in deep South African mines. We extend the concept of an optimal time and space correlation region and test it on the eight main shocks of our catalog provided by ISSI. In a first test, we use the simplest signature of criticality in terms of a power-law time-to-failure formula. Notwithstanding the fact that the search for the optimal correlation size is performed with this simple power law, we find evidence both for accelerated seismicity and for the presence of log-periodic behavior with a preferred scaling factor close to 2. We then propose a new algorithm based on a space and time smoothing procedure, which is also intended to account for the finite-range and finite-time mechanical interactions between events. This new algorithm provides a much more robust and efficient construction of the optimal correlation region, which allows us to use the log-periodic formula directly in the search process. In this preliminary work, we have only tested the new algorithm on the largest event in the catalog. The result is of remarkably good quality, with a dramatic improvement in accuracy and robustness. This confirms the potential importance of log-periodic signals. Our study opens the way to an efficient implementation of a systematic procedure for testing real-time predictions. Comment: 22 pages, 32 figures
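    A minimal sketch of a time-to-failure fit with a log-periodic correction, assuming the commonly used form A + B*(t_f - t)**m * (1 + C*cos(omega*ln(t_f - t) + phi)); a scaling factor close to 2 corresponds to omega = 2*pi/ln(2). The synthetic release curve, parameter bounds, and starting values are placeholders, and the paper's optimal-correlation-region search is not reproduced here.

```python
# Minimal sketch (assumed functional form, synthetic data): fit accelerating
# seismic release with a power-law time-to-failure term plus a log-periodic
# correction, A + B*(t_f - t)**m * (1 + C*cos(omega*log(t_f - t) + phi)).
import numpy as np
from scipy.optimize import curve_fit

def lp_power_law(t, t_f, A, B, m, C, omega, phi):
    dt = np.clip(t_f - t, 1e-3, None)         # keep time-to-failure positive
    return A + B * dt**m * (1.0 + C * np.cos(omega * np.log(dt) + phi))

rng = np.random.default_rng(4)
t = np.linspace(0.0, 9.5, 200)                 # observation window, t_f = 10
truth = lp_power_law(t, 10.0, 100.0, -30.0, 0.4, 0.2,
                     2 * np.pi / np.log(2), 0.0)
obs = truth + rng.normal(0.0, 0.5, size=t.size)  # noisy cumulative release

p0 = (10.5, 90.0, -25.0, 0.5, 0.1, 8.0, 0.0)
bounds = ([9.6, 0.0, -1e3, 0.05, -1.0, 1.0, -2 * np.pi],
          [15.0, 1e3, 0.0, 1.0, 1.0, 20.0, 2 * np.pi])
popt, _ = curve_fit(lp_power_law, t, obs, p0=p0, bounds=bounds)
print(f"fitted failure time t_f ~ {popt[0]:.2f}, exponent m ~ {popt[3]:.2f}")
```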

    Rupture by damage accumulation in rocks

    The deformation of rocks is associated with the nucleation and propagation of microcracks, i.e. damage. The accumulation of damage and its spatial localization lead to the creation of a macroscale discontinuity, a so-called "fault" in geological terms, and to the failure of the material, i.e. a dramatic decrease of mechanical properties such as strength and modulus. The damage process can be studied both statically, by direct observation of thin sections, and dynamically, by recording the acoustic waves emitted by crack propagation (acoustic emission). Here we first review such observations concerning geological objects over scales ranging from the laboratory sample scale (dm) to seismically active faults (km), including cliffs and rock masses (Dm, hm). These observations reveal complex patterns in the space (fractal properties of damage structures such as roughness and gouge), time (clustering, characteristic trends as failure approaches) and energy domains (power-law distributions of energy-release bursts). We use a numerical model based on progressive damage within an elastic-interaction framework, which allows us to simulate these observations. This study shows that failure in rocks can be the result of damage accumulation.
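    A minimal sketch of damage accumulation leading to failure, using a simple equal-load-sharing fiber-bundle model rather than the elastic-interaction model of the study; the strength distribution, sample size, and burst bookkeeping are illustrative assumptions.

```python
# Minimal sketch (equal-load-sharing fiber-bundle model, NOT the paper's
# elastic-interaction damage model): quasi-static loading of N fibers with
# random strengths; each load increment can trigger an avalanche ("burst")
# of failures, a crude proxy for acoustic-emission events.
import numpy as np

rng = np.random.default_rng(5)
N = 20000
thresholds = np.sort(rng.uniform(0.0, 1.0, size=N))   # fiber strengths

bursts = []          # fibers broken per external load increment
k = 0                # number of broken fibers so far
while k < N:
    # Raise the external force just enough to break the weakest intact fiber
    F = thresholds[k] * (N - k)
    burst = 0
    # Survivors share the load equally; keep breaking fibers while the
    # per-fiber load exceeds the next strength threshold (the avalanche)
    while k < N and F / (N - k) >= thresholds[k]:
        k += 1
        burst += 1
    bursts.append(burst)

bursts = np.array(bursts)
# The final, system-spanning burst is the macroscopic failure; earlier bursts
# follow a power-law size distribution (mean-field exponent 5/2)
print("number of bursts:", len(bursts), "largest burst:", bursts.max())
```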