
    Dragon-kings: Mechanisms, statistical methods and empirical evidence

    This introductory article presents the special Discussion and Debate volume "From black swans to dragon-kings: is there life beyond power laws?". We summarize the contributions and put them in perspective along three main themes: (i) mechanisms for dragon-kings, (ii) detection of dragon-kings and statistical tests, and (iii) empirical evidence in a large variety of natural and social systems. Overall, we are pleased to witness significant advances both in the introduction and clarification of underlying mechanisms and in the development of novel, efficient tests that demonstrate clear evidence for the presence of dragon-kings in many systems. However, this positive view should be balanced by the fact that this remains a very delicate and difficult field, if only because of the scarcity of data and the extraordinarily important implications for hazard assessment, risk control and predictability.

    Automatic Reconstruction of Fault Networks from Seismicity Catalogs: 3D Optimal Anisotropic Dynamic Clustering

    We propose a new pattern recognition method that reconstructs the 3D structure of the active part of a fault network from the spatial locations of earthquakes. The method generalizes the so-called dynamic clustering method, which partitions a set of data points into clusters using a global minimization criterion over the spatial inertia of those clusters. The new method improves on it by taking into account the full spatial inertia tensor of each cluster, in order to partition the dataset into fault-like, anisotropic clusters. Given a catalog of seismic events, the output is the optimal set of plane segments that fits the spatial structure of the data. Each plane segment is fully characterized by its location, size and orientation. The main tunable parameter is the accuracy of the earthquake localizations, which fixes the resolution, i.e. the residual variance of the fit. The resolution determines the number of fault segments needed to describe the earthquake catalog: the better the resolution, the finer the structure of the reconstructed fault segments. The algorithm successfully reconstructs the fault segments of synthetic earthquake catalogs. Applied to a real catalog consisting of a subset of the aftershock sequence of the 28 June 1992 Landers earthquake in Southern California, the reconstructed plane segments fully agree with faults already known on geological maps, or with blind faults that appear quite obvious in longer-term catalogs. Future improvements of the method are discussed, as well as its potential use in the multi-scale study of the inner structure of fault zones.
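The anisotropic dynamic-clustering idea above can be sketched as an alternating "k-planes" loop: assign each event to its nearest candidate plane, then refit each plane from the inertia (covariance) tensor of its cluster via PCA. This is a minimal illustrative sketch on synthetic data, not the authors' implementation; all function names and parameters are hypothetical.

```python
import numpy as np

def fit_plane(points):
    """Fit a plane to 3D points via PCA: returns (centroid, unit normal)."""
    c = points.mean(axis=0)
    # the singular vector with smallest variance is the plane normal
    _, _, vt = np.linalg.svd(points - c)
    return c, vt[-1]

def k_planes(points, k, n_iter=50, seed=0):
    """Toy anisotropic dynamic clustering: alternate between assigning each
    event to its nearest plane segment and refitting each plane by PCA."""
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, k, len(points))
    for _ in range(n_iter):
        planes = []
        for j in range(k):
            pts = points[labels == j]
            if len(pts) < 3:  # degenerate cluster: reseed from random events
                pts = points[rng.choice(len(points), 3, replace=False)]
            planes.append(fit_plane(pts))
        # distance of every event to every candidate plane
        d = np.stack([np.abs((points - c) @ n) for c, n in planes], axis=1)
        new = d.argmin(axis=1)
        if np.array_equal(new, labels):
            break
        labels = new
    return labels, planes

# synthetic catalog: two noisy fault-like planes
rng = np.random.default_rng(1)
a = np.c_[rng.uniform(0, 10, 200), rng.uniform(0, 10, 200), rng.normal(0, 0.1, 200)]
b = np.c_[rng.uniform(0, 10, 200), rng.normal(5, 0.1, 200), rng.uniform(-5, 5, 200)]
labels, planes = k_planes(np.vstack([a, b]), k=2)
```

The residual thickness of each cluster plays the role of the localization accuracy discussed above: it fixes how many plane segments are needed to describe the catalog.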

    Multifractal Omori law for earthquake triggering: new tests on the California, Japan and worldwide catalogues

    The Multifractal Stress-Activated model is a statistical model of triggered seismicity based on mechanical and thermodynamic principles. It predicts that, above a triggering magnitude cut-off M0, the exponent p of the Omori law for the time decay of the rate of aftershocks is a linearly increasing function p(M) = a0 M + b0 of the main shock magnitude M. We previously reported empirical support for this prediction using the Southern California Earthquake Center (SCEC) catalogue. Here, we confirm this observation using an updated, longer version of the same catalogue, as well as new methods to estimate p. One of these methods is the newly defined Scaling Function Analysis (SFA), adapted from the wavelet transform. This method can measure a mathematical singularity (hence a p-value) while erasing the possible regular part of a time series. The SFA also proves particularly efficient at revealing the coexistence and superposition of several types of relaxation laws (typical Omori sequences and short-lived swarm sequences) which can be mixed within the same catalogue. Another new method consists of monitoring the largest aftershock magnitude observed in successive time intervals, which circumvents the problem of missing small-magnitude events in aftershock catalogues. The same methods are applied to data from the worldwide Harvard Centroid Moment Tensor (CMT) catalogue and show results compatible with those of Southern California. For the Japan Meteorological Agency (JMA) catalogue, we still observe a linear dependence of p on M, but with a smaller slope. The SFA shows, however, that results for this catalogue may be biased by numerous swarm sequences, despite our efforts to remove them before the analysis.
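The predicted linear law p(M) = a0 M + b0 can be illustrated numerically. The constants below are hypothetical, chosen only so that p(3) ≈ 0.6 and p(7) ≈ 1.1 as reported for Southern California; the script then recovers p from a synthetic Omori decay by least squares in log-log coordinates.

```python
import numpy as np

# Illustrative values only: a0, b0 chosen so that p(3) = 0.6 and p(7) = 1.1;
# they are not fitted constants from the paper.
a0, b0 = 0.125, 0.225

def p_of_M(M):
    """Predicted Omori exponent as a linear function of mainshock magnitude."""
    return a0 * M + b0

# synthetic Omori decay for an M = 6 mainshock, then recover p by a
# least-squares fit of log(rate) versus log(time)
t = np.logspace(-1, 3, 200)                # days after the mainshock
rate = 100.0 * t ** (-p_of_M(6.0))         # rate ~ t^(-p)
slope, _ = np.polyfit(np.log(t), np.log(rate), 1)
print(round(-slope, 3))                    # recovered exponent p
```

On noise-free synthetic rates the recovered slope equals p(M) exactly; on real aftershock counts the fit would of course scatter around the linear trend.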

    Segmentation of Fault Networks Determined from Spatial Clustering of Earthquakes

    We present a new method of data clustering applied to earthquake catalogs, with the goal of reconstructing the seismically active part of fault networks. We first use an original method to separate clustered events from uncorrelated seismicity, based on the distribution of volumes of tetrahedra defined by closest-neighbor events in the original and randomized seismic catalogs. The spatial disorder of the complex geometry of fault networks is then taken into account by defining faults as probabilistic anisotropic kernels, whose structures are motivated by properties of discontinuous tectonic deformation and previous empirical observations of the geometry of faults and of earthquake clusters at many spatial and temporal scales. Combining this a priori knowledge with information-theoretical arguments, we propose a Gaussian mixture approach implemented in an Expectation-Maximization (EM) procedure. A cross-validation scheme is then used to determine the number of kernels that provides an optimal clustering of the catalog. This three-step approach is applied to a high-quality relocated catalog of the seismicity following the 1986 Mount Lewis (M_l = 5.7) event in California and reveals that events cluster along planar patches of about 2 km^2, i.e. comparable to the size of the main event. The finite thickness of those clusters (about 290 m) suggests that events do not occur on well-defined Euclidean fault core surfaces, but rather that the damage zone surrounding faults may be seismically active at depth. Finally, we propose a connection between our methodology and multi-scale spatial analysis, based on the derivation of a spatial fractal dimension of about 1.8 for the set of hypocenters in the Mount Lewis area, consistent with recent observations on relocated catalogs.
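A minimal sketch of the second and third steps, under simplifying assumptions (full-covariance Gaussian kernels, a single train/test split instead of full cross-validation, toy 2D data): fit mixtures with EM for several numbers of kernels and keep the one with the best held-out log-likelihood. This is not the authors' code; all names are hypothetical.

```python
import numpy as np

def mvn_logpdf(X, m, S):
    """Log-density of a multivariate normal, via Cholesky factorization."""
    d = X.shape[1]
    L = np.linalg.cholesky(S)
    z = np.linalg.solve(L, (X - m).T)
    return -0.5 * (d * np.log(2 * np.pi) + (z ** 2).sum(0)) - np.log(np.diag(L)).sum()

def em_gmm(X, k, n_iter=100, seed=0):
    """Minimal full-covariance Gaussian-mixture EM (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    mu = X[rng.choice(n, k, replace=False)]
    cov = np.stack([np.cov(X.T) + 1e-6 * np.eye(d)] * k)
    w = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibilities of each kernel for each event
        logp = np.stack([np.log(w[j]) + mvn_logpdf(X, mu[j], cov[j]) for j in range(k)], 1)
        logp -= logp.max(1, keepdims=True)
        r = np.exp(logp); r /= r.sum(1, keepdims=True)
        # M-step: update weights, means, covariances
        nk = r.sum(0)
        w = nk / n
        mu = (r.T @ X) / nk[:, None]
        for j in range(k):
            dx = X - mu[j]
            cov[j] = (r[:, j, None] * dx).T @ dx / nk[j] + 1e-6 * np.eye(d)
    return w, mu, cov

def heldout_loglik(Xtr, Xte, k):
    """Fit on a training split, score log-likelihood on a held-out split."""
    w, mu, cov = em_gmm(Xtr, k)
    logp = np.stack([np.log(w[j]) + mvn_logpdf(Xte, mu[j], cov[j]) for j in range(k)], 1)
    m = logp.max(1, keepdims=True)
    return (m.squeeze() + np.log(np.exp(logp - m).sum(1))).sum()

# two well-separated synthetic clusters; validation should prefer k >= 2
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (150, 2)), rng.normal(8, 1, (150, 2))])
tr, te = X[::2], X[1::2]
scores = {k: heldout_loglik(tr, te, k) for k in (1, 2, 3)}
best_k = max(scores, key=scores.get)
```

In the paper's setting the kernels are anisotropic by construction; here the held-out score simply penalizes both under- and over-segmentation of the catalog.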

    Multifractal Scaling of Thermally-Activated Rupture Processes

    We propose a "multifractal stress activation" model combining thermally activated rupture and long-memory stress relaxation, which predicts that seismic decay rates after mainshocks follow the Omori law ~1/t^p with exponent p linearly increasing with the magnitude M_L of the mainshock and with the inverse temperature. We carefully test this prediction on earthquake sequences in the Southern California Earthquake catalog: we find power-law relaxations of seismic sequences triggered by mainshocks, with exponents p increasing with the mainshock magnitude by approximately 0.1-0.15 per magnitude unit, from p(M_L = 3) ≈ 0.6 to p(M_L = 7) ≈ 1.1, in good agreement with the prediction of the multifractal model.
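The measurement of p can be sketched as follows: draw synthetic aftershock times from a density proportional to t^(-p) on a finite observation window by inverse-transform sampling, then recover p by maximum likelihood over a grid. Window, sample size and p value are illustrative, not taken from the paper.

```python
import numpy as np

# Observation window (days) and a true exponent in the reported range;
# both are illustrative choices.
a_t, b_t = 0.1, 1000.0
p_true = 0.9

# inverse-CDF sampling from f(t) ~ t^(-p) truncated to [a_t, b_t] (p != 1)
rng = np.random.default_rng(0)
u = rng.uniform(size=5000)
q = 1.0 - p_true
t = (a_t ** q + u * (b_t ** q - a_t ** q)) ** (1.0 / q)

def loglik(p, t):
    """Log-likelihood of the truncated power-law density C * t^(-p)."""
    q = 1.0 - p
    return len(t) * np.log(q / (b_t ** q - a_t ** q)) - p * np.log(t).sum()

grid = np.arange(0.3, 1.5, 0.001)
grid = grid[np.abs(grid - 1.0) > 1e-6]   # p = 1 needs a log form; skip it
p_hat = grid[np.argmax([loglik(p, t) for p in grid])]
```

With thousands of events the maximum-likelihood estimate is tight; for real sequences the hard part is isolating one mainshock's aftershocks, which is where the Scaling Function Analysis of the companion paper comes in.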

    Organisation of joints and faults from 1-cm to 100-km scales revealed by optimized anisotropic wavelet coefficient method and multifractal analysis

    The classical method of statistical physics deduces the macroscopic behaviour of a system from the organization and interactions of its microscopic constituents. This kind of problem can often be solved using procedures deduced from Renormalization Group Theory, but in some cases the basic microscopic rules are unknown and one has to deal only with the intrinsic geometry. The wavelet analysis concept appears particularly adapted to this kind of situation, as it highlights details of a set at a given analyzed scale. As fractures and faults generally define highly anisotropic fields, we defined a new renormalization procedure based on the use of anisotropic wavelets. This approach consists of finding an optimum filter that maximizes the wavelet coefficients at each point of the field. Its intrinsic definition allows us to compute a rose diagram of the main structural directions present in the field at every scale. Scaling properties are determined using a multifractal box-counting analysis, improved to take account of samples with irregular geometry and finite size. In addition, we present histograms of the fault length distribution. Our main observation is that different geometries and scaling laws hold for different ranges of scales, separated by boundaries that correlate well with the thicknesses of the lithological units that constitute the continental crust. At scales involving the deformation of the crystalline crust, we find that faulting displays singularities similar to those commonly observed in Diffusion-Limited Aggregation processes.
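The box-counting step can be sketched as follows: count the number N(s) of occupied boxes at each box size s and estimate the dimension D from the slope of log N(s) versus log s. A minimal sketch checked on a filled unit square, where D should come out close to 2; the multifractal and irregular-geometry corrections of the paper are omitted.

```python
import numpy as np

def box_counting_dimension(points, sizes):
    """Estimate the box-counting dimension D from the scaling N(s) ~ s^(-D):
    count occupied grid cells at each size s, fit log N vs log s."""
    counts = []
    for s in sizes:
        cells = np.floor(points / s).astype(int)   # grid index of each point
        counts.append(len(np.unique(cells, axis=0)))
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

# sanity check: points filling the unit square have dimension ~ 2
rng = np.random.default_rng(0)
pts = rng.uniform(0, 1, (20000, 2))
D = box_counting_dimension(pts, sizes=[1/4, 1/8, 1/16, 1/32])
```

For a fault map, `points` would be the digitized fracture traces, and the scale range of the fit would be restricted to one of the regimes separated by the lithological boundaries discussed above.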