
    General theory of the modified Gutenberg-Richter law for large seismic moments

    The Gutenberg-Richter power law distribution of earthquake sizes is one of the most famous examples illustrating self-similarity. It is well known that the Gutenberg-Richter distribution has to be modified for large seismic moments, due to energy conservation and geometrical reasons. Several models have been proposed, either in terms of a second power law with a larger b-value beyond a cross-over magnitude, or based on a "hard" magnitude cut-off or a "soft" magnitude cut-off using an exponential taper. Since large-scale tectonic deformation is dominated by the very largest earthquakes, and since their impact on loss of life and property is huge, it is of great importance to constrain the shape of their distribution as tightly as possible. We present a simple and powerful probabilistic theoretical approach showing that the Gamma distribution is the best model, under the two hypotheses that the Gutenberg-Richter power law distribution holds in the absence of any condition (condition of criticality) and that one or several constraints are imposed, based either on conservation laws or on the nature of the observations themselves. The selection of the Gamma distribution does not depend on the specific nature of the constraint. We illustrate the approach with two constraints: the existence of a finite moment release rate and the observation of the size of a maximum earthquake in a finite catalog. Our predicted "soft" maximum magnitudes compare favorably with those obtained by Kagan [1997] for the Flinn-Engdahl regionalization of subduction zones, collision zones and mid-ocean ridges.
    Comment: 24 pages, including 3 tables, in press in Bull. Seism. Soc. A
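The "soft" cut-off the abstract describes can be sketched numerically. The snippet below is a minimal illustration, not the paper's derivation: it compares a pure power-law survival function for seismic moment with the Gamma-type form (power law times an exponential taper beyond a corner moment). The parameter values (β = 0.66, corresponding to b ≈ 1, and the corner moment `m_corner`) are assumptions chosen for illustration.

```python
import math

def gr_pure(m, beta=0.66, m_min=1.0):
    """Pure Gutenberg-Richter survival function for seismic moment m
    (unconditional power law, the 'critical' case)."""
    return (m / m_min) ** (-beta)

def gr_gamma_taper(m, beta=0.66, m_min=1.0, m_corner=1e3):
    """Gamma-modified survival function: the same power law multiplied by
    an exponential taper, which becomes active beyond m_corner (the 'soft'
    cut-off imposed by a constraint such as a finite moment release rate)."""
    return (m / m_min) ** (-beta) * math.exp(-(m - m_min) / m_corner)
```

For moments well below the corner the two forms are indistinguishable; well above it the taper suppresses the largest events, which is exactly the regime where the modification to Gutenberg-Richter matters.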

    Acoustic fluidization for earthquakes?

    Melosh [1996] has suggested that acoustic fluidization could provide an alternative to theories that are invoked as explanations for why some crustal faults appear to be weak. We show that there is a subtle but profound inconsistency in the theory that unfortunately invalidates the results. We propose possible remedies but must acknowledge that the relevance of acoustic fluidization remains an open question.
    Comment: 13 page

    Significance of log-periodic precursors to financial crashes

    We clarify the status of log-periodicity associated with speculative bubbles preceding financial crashes. In particular, we address Feigenbaum's [2001] criticism and show how it can be rebutted. Feigenbaum's main result is as follows: "the hypothesis that the log-periodic component is present in the data cannot be rejected at the 95% confidence level when using all the data prior to the 1987 crash; however, it can be rejected by removing the last year of data" (i.e., by removing 15% of the data closest to the critical point). We stress that it is naive to expect to analyze a critical point phenomenon, i.e., a power law divergence, reliably after removing the most important part of the data, closest to the critical point. We also present the history of log-periodicity in the present context, explaining its essential features and why it may be important. We offer an extension of the rational expectation bubble model for general and arbitrary risk-aversion within the general stochastic discount factor theory. We suggest guidelines for using log-periodicity and explain how to develop and interpret statistical tests of log-periodicity. We discuss the issue of prediction based on our results and the evidence of outliers in the distribution of drawdowns. New statistical tests demonstrate that the 1% to 10% quantile of the largest events of the population of drawdowns of the Nasdaq composite index and of the Dow Jones Industrial Average index belong to a distribution significantly different from the rest of the population. This suggests that very large drawdowns result from an amplification mechanism that may make them more predictable than smaller market moves.
    Comment: Latex document of 38 pages including 16 eps figures and 3 tables, in press in Quantitative Financ

    Stock market crashes are outliers

    We call attention to what seems to be a widely held misconception, according to which large crashes are simply the largest events of distributions of price variations with fat tails. We demonstrate on the Dow Jones Industrial index that, with high probability, the three largest crashes in this century are outliers. This result supports the suggestion that large crashes result from specific amplification processes that might lead to observable precursory signatures.
    Comment: 8 pages, 3 figures (accepted in European Physical Journal B

    Effects of Diversity and Procrastination in Priority Queuing Theory: the Different Power Law Regimes

    Empirical analyses show that, after the update of a browser, the publication of the vulnerability of a software, or the discovery of a cyber worm, the fraction of computers still using the older version, or not yet patched, or exhibiting worm activity decays as a power law ~1/t^α with 0 < α ≤ 1 over time scales of years. We present a simple model for this persistence phenomenon, framed within standard priority queuing theory, of a target task which has the lowest priority compared with all other tasks that flow on the computer of an individual. We identify a "time deficit" control parameter β and a bifurcation to a regime where there is a non-zero probability for the target task to never be completed. The distribution of waiting time T till the completion of the target task has the power law tail ~1/t^{1/2}, resulting from a first-passage solution of an equivalent Wiener process. Taking into account a diversity of time deficit parameters in a population of individuals, the power law tail is changed into 1/t^α with α ∈ (0.5, ∞), including the well-known case 1/t. We also study the effect of "procrastination", defined as the situation in which the target task may be postponed or delayed even after the individual has solved all other pending tasks. This new regime provides an explanation for even slower apparent decay and longer persistence.
    Comment: 32 pages, 10 figure

    The log-periodic-AR(1)-GARCH(1,1) model for financial crashes

    This paper intends to meet recent calls for more rigorous statistical methodology within the econophysics literature. To this end, we consider an econometric approach to investigate the outcomes of the log-periodic model of price movements, which has been widely used to forecast financial crashes. In order to accomplish reliable statistical inference for the unknown parameters, we incorporate an autoregressive dynamic and a conditional heteroskedasticity structure in the error term of the original model, yielding the log-periodic-AR(1)-GARCH(1,1) model. Both the original and the extended models are fitted to financial indices of the U.S. market, namely the S&P500 and the NASDAQ. Our analysis reveals two main points: (i) the log-periodic-AR(1)-GARCH(1,1) model has residuals with better statistical properties, and (ii) the estimation of the parameter concerning the time of the financial crash is improved.
    Comment: 17 pages, 4 figures, 12 tables, to appear in European Physical Journal