Gambling scores in earthquake prediction analysis
The number of successes 'n' and the normalized measure of space-time alarm
'tau' are commonly used to characterize the strength of an earthquake
prediction method and the significance of prediction results. To better
evaluate the forecaster's skill, it has recently been suggested to use a new
characteristic, the gambling score R, which incorporates the difficulty of
guessing each target event by using different weights for different alarms. We
expand the class of R-characteristics and apply these to the analysis of
results of the M8 prediction algorithm. We show that the level of significance
'alpha' strongly depends (1) on the choice of weighting alarm parameters, (2) on
the partitioning of the entire alarm volume into component parts, and (3) on
the accuracy of the spatial rate of target events, m(dg). These tools are at
the disposal of the researcher and can affect the significance estimate in
either direction. All the R-statistics discussed here corroborate that the
prediction of 8.0 <= M < 8.5 events by the M8 method is nontrivial. However,
conclusions based on traditional characteristics (n,tau) are more reliable
owing to two circumstances: 'tau' is stable since it is based on relative
values of m(.), and the 'n' statistic makes it possible to construct an upper
estimate of 'alpha' that takes into account the uncertainty of m(.).
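As a minimal illustration of the (n, tau) significance estimate discussed above, the Python sketch below computes the upper-tail binomial p-value ('alpha') for n successes under a random-guessing null, together with a toy weighted hit score in the spirit of the gambling score R. The function names, the binomial null, the weights, and the example numbers are assumptions chosen for illustration, not the specific R-statistics or the M8 results analyzed in the paper.

    import math

    def alarm_significance(n_hits, n_targets, tau):
        """Upper-tail binomial p-value ('alpha'): probability of n_hits or more
        successes out of n_targets target events when alarms cover a fraction
        tau of the normalized space-time volume and targets fall at random."""
        return sum(math.comb(n_targets, k) * tau**k * (1.0 - tau)**(n_targets - k)
                   for k in range(n_hits, n_targets + 1))

    def gambling_score(hit_flags, weights):
        """Toy weighted score: each predicted target contributes its weight
        (larger weights for 'harder' targets); missed targets contribute nothing."""
        return sum(w for hit, w in zip(hit_flags, weights) if hit)

    # Example: 7 of 8 targets fall inside alarms occupying 30% of space-time.
    print(alarm_significance(7, 8, 0.30))                       # ~ 0.0013
    print(gambling_score([1, 1, 0, 1], [2.0, 1.0, 5.0, 1.5]))   # -> 4.5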
On Operational Earthquake Forecast and Prediction Problems.
In his SSA presidential address (Jordan, 2014), and later in a more extended publication with coauthors (Jordan et al., 2014), Jordan presents a vision of the forecast and prediction problems of earthquake system science. As experienced practitioners, and in full appreciation of scientific studies on earthquake forecasting, we find it necessary to share a complementary viewpoint.
Predictability in the ETAS Model of Interacting Triggered Seismicity
As part of an effort to develop a systematic methodology for earthquake
forecasting, we use a simple model of seismicity based on interacting events
which may trigger a cascade of earthquakes, known as the Epidemic-Type
Aftershock Sequence model (ETAS). The ETAS model is constructed on a bare
(unrenormalized) Omori law, the Gutenberg-Richter law and the idea that large
events trigger more numerous aftershocks. For simplicity, we do not use the
information on the spatial location of earthquakes and work only in the time
domain. We offer an analytical approach to account for the yet unobserved
triggered seismicity adapted to the problem of forecasting future seismic rates
at varying horizons from the present. Tests presented on synthetic catalogs
strongly validate the importance of taking into account all the cascades of
still unobserved triggered events in order to correctly predict the future
level of seismicity beyond a few minutes. We find strong predictability if
one is content to predict only a small fraction of the large-magnitude targets.
However, the probability gains degrade fast when one attempts to predict a
larger fraction of the targets. This is because a significant fraction of
events remain uncorrelated with past seismicity. This delineates the
fundamental limits underlying forecasting skills, stemming from an intrinsic
stochastic component in these interacting triggered seismicity models.
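The sketch below simulates a purely temporal ETAS catalog along the lines described above: a Poisson background, Gutenberg-Richter magnitudes, delays drawn from a bare Omori kernel, and magnitude-dependent productivity, with every generation of triggered events allowed to trigger the next. The parameter values and function names are illustrative assumptions (chosen so that the branching ratio K*b/(b - alpha) stays below one), not the calibration used in the paper.

    import math
    import random

    # Illustrative parameters, not values fitted in the paper
    MU = 0.2              # background rate (events per day)
    K, ALPHA = 0.15, 0.8  # productivity: n(m) = K * 10**(ALPHA * (m - M0))
    C, P = 0.01, 1.2      # bare Omori kernel parameters (days)
    B, M0 = 1.0, 3.0      # Gutenberg-Richter b-value and completeness magnitude
    T_END = 365.0         # catalog length (days); branching ratio K*B/(B-ALPHA) = 0.75

    def poisson(lam):
        """Simple Knuth-style Poisson sampler (adequate for these modest means)."""
        threshold, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= random.random()
            if p <= threshold:
                return k
            k += 1

    def gr_magnitude():
        """Gutenberg-Richter magnitude above the completeness level M0."""
        return M0 - math.log10(random.random()) / B

    def omori_delay():
        """Inverse-CDF sample from the normalized bare Omori density
        f(t) = (P - 1) * C**(P - 1) / (t + C)**P."""
        return C * (random.random() ** (-1.0 / (P - 1.0)) - 1.0)

    def simulate_etas():
        """Purely temporal ETAS catalog: Poisson background plus the full
        cascade of triggered events, each generation triggering the next."""
        queue = [(random.uniform(0.0, T_END), gr_magnitude())
                 for _ in range(poisson(MU * T_END))]
        catalog = []
        while queue:
            t, m = queue.pop()
            if t > T_END:
                continue
            catalog.append((t, m))
            for _ in range(poisson(K * 10.0 ** (ALPHA * (m - M0)))):
                queue.append((t + omori_delay(), gr_magnitude()))
        return sorted(catalog)

    print(len(simulate_etas()), "events in the synthetic catalog")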
Predictability of Volcano Eruption: lessons from a basaltic effusive volcano
Volcano eruption forecast remains a challenging and controversial problem
despite the fact that data from volcano monitoring have increased significantly
in quantity and quality over the last decades. This study uses pattern
recognition techniques to quantify the predictability of the 15 Piton de la
Fournaise (PdlF) eruptions in the 1988-2001 period, using the increase in the
daily seismicity rate as a precursor. The lead time of this prediction is a few days to
weeks. Using the daily seismicity rate, we formulate a simple prediction rule,
use it for retrospective prediction of the 15 eruptions, and test the prediction
quality with error diagrams. The best prediction performance corresponds to
averaging the daily seismicity rate over 5 days and issuing a prediction alarm
for 5 days. 65% of the eruptions are predicted for an alarm duration less than
20% of the time considered. Even though this result is accompanied by a large
number of false alarms, it is obtained with a crude counting of daily events
that are available from most volcano observatories.
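To make the rule concrete, here is a minimal sketch under stated assumptions: average the daily seismicity rate over five days, declare an alarm for the following five days whenever the average exceeds a threshold, and summarize retrospective performance as one point of an error diagram (fraction of missed eruptions versus fraction of time covered by alarms). The threshold, the function name, and the toy numbers are assumptions, not the values obtained for Piton de la Fournaise.

    def error_diagram_point(daily_counts, eruption_days, threshold,
                            avg_window=5, alarm_len=5):
        """Retrospective test of a simple prediction rule: when the seismicity
        rate averaged over `avg_window` days exceeds `threshold`, declare an
        alarm for the next `alarm_len` days.  Returns (fraction of missed
        eruptions, fraction of time in alarm) -- one error-diagram point."""
        n_days = len(daily_counts)
        alarm = [False] * n_days
        for d in range(avg_window - 1, n_days):
            avg = sum(daily_counts[d - avg_window + 1:d + 1]) / avg_window
            if avg > threshold:
                for a in range(d + 1, min(d + 1 + alarm_len, n_days)):
                    alarm[a] = True
        miss_fraction = 1.0 - sum(1 for e in eruption_days if alarm[e]) / len(eruption_days)
        return miss_fraction, sum(alarm) / n_days

    # Toy usage with made-up numbers (not the PdlF catalog):
    counts = [2, 1, 3, 2, 2, 8, 12, 15, 4, 2, 1, 2, 2, 9, 11, 3, 1, 1]
    print(error_diagram_point(counts, eruption_days=[8, 15], threshold=6.0))
    # -> (0.0, ~0.44): both toy 'eruptions' fall inside alarms covering ~44% of days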
On the tilt of the Earth's polar axis (climate): Some 'impressionist' remarks
In this lengthy letter, we set out to discuss the concept of climate based on
definitions established for over a century and direct observations that we have
been collecting for more than a century as well. To do this, we present and
discuss the remarkably stable maps over time of the various physical parameters
that make up the climate corpus: solar temperature, atmospheric pressure,
winds, precipitation, temperature anomalies. This impressionistic tableau that
we are gradually sketching as our reflection unfolds leads us to the following
proposition: What if, as Laplace first proposed in 1799 and later
Milanković in 1920, ground temperature were merely a consequence of climate
and not a separate parameter of climate in its own right?
Reverse Detection of Short-Term Earthquake Precursors
We introduce a new approach to short-term earthquake prediction based on the
concept of self-organization of seismically active fault networks. The approach
is named "Reverse Detection of Precursors" (RDP), since it considers precursors
in reverse order of their appearance. This makes it possible to detect
precursors undetectable by direct analysis. Possible mechanisms underlying RDP
are outlined. RDP is described with a concrete example: we consider as
short-term precursors the newly introduced chains of earthquakes reflecting the
rise of an earthquake correlation range, and detect (retrospectively) such
chains a few months before two prominent Californian earthquakes - Landers,
1992, M = 7.6, and Hector Mine, 1999, M = 7.3, with one false alarm. Similar
results (described elsewhere) are obtained by RDP for 21 more strong
earthquakes in California (M >= 6.4), Japan (M >= 7.0) and the Eastern
Mediterranean (M >= 6.5). Validation of the RDP approach requires, as always,
prediction in advance, for which this study sets up a base. We have the first
case of advance prediction; it was reported before the Tokachi-oki earthquake (near
Hokkaido island, Japan), Sept. 25, 2003, M = 8.1. RDP has potentially important
applications to other precursors and to prediction of other critical phenomena
besides earthquakes. In particular, it might vindicate some short-term
precursors, previously rejected as giving too many false alarms.
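A toy version of the chain construction mentioned above can be sketched as follows: link any two events that are sufficiently close in time and space, take the connected components as 'chains', and keep only the long ones, since long chains connect distant parts of the fault network and thus reflect a rising correlation range. The linking thresholds, the function name, and the toy catalog are assumptions for illustration, not the published RDP parameters.

    import math

    def link_chains(events, max_dt=30.0, max_dist=50.0, min_size=4):
        """Group events (t_days, x_km, y_km, mag) into 'chains': union-find over
        pairs closer than max_dt days and max_dist km, keeping chains with at
        least min_size members."""
        parent = list(range(len(events)))

        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]   # path halving
                i = parent[i]
            return i

        for i, (ti, xi, yi, _) in enumerate(events):
            for j, (tj, xj, yj, _) in enumerate(events[:i]):
                if abs(ti - tj) <= max_dt and math.hypot(xi - xj, yi - yj) <= max_dist:
                    parent[find(i)] = find(j)   # merge the two clusters

        chains = {}
        for i in range(len(events)):
            chains.setdefault(find(i), []).append(events[i])
        return [c for c in chains.values() if len(c) >= min_size]

    # Toy usage with made-up events (time in days, position in km, magnitude):
    catalog = [(0, 0, 0, 4.1), (5, 20, 10, 4.3), (12, 45, 30, 4.0),
               (20, 70, 55, 4.5), (300, 400, 400, 4.2)]
    print(len(link_chains(catalog, min_size=3)))   # -> 1 (a single chain of four events)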
On self-similarity of premonitory patterns in the regions of natural and induced seismicity
Anticipating the scale invariance of rock fracturing processes, we applied Keilis-Borok's algorithm M8, originally designed for identifying times of increased probability (TIPs) of occurrence of strong earthquakes (M >= 8.0), retrospectively to Koyna earthquakes, which occurred in the region after the impoundment of the Shivaji Sagar reservoir in 1962. The algorithm, which enables diagnosis of TIPs from the 7th year onward after the commencement of the earliest available data set, showed that the magnitude 5.3 earthquake of 20 September 1980 indeed occurred within a time of increased probability. This result, apart from its potential application to recognizing future TIPs in the region, points to self-similarity between the premonitory patterns of natural and induced earthquakes and to the scale-invariant nature of their processes. Further, a typical precursory rise in seismicity followed by a relative quiescence was also found to precede all three larger earthquakes of the sequence.
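The 'precursory rise in seismicity followed by a relative quiescence' noted at the end can be caricatured with simple windowed rate comparisons, as sketched below; the window length, the factors, and the function name are assumptions for illustration and are not the actual functions of the M8 algorithm.

    def rise_then_quiescence(daily_counts, window=90,
                             rise_factor=1.5, quiet_factor=0.5):
        """Flag days where the rate over the earlier `window` days exceeded
        `rise_factor` times the long-term mean while the most recent `window`
        days fall below `quiet_factor` times that mean -- a crude stand-in for
        'activation followed by relative quiescence'."""
        mean_rate = sum(daily_counts) / len(daily_counts)
        flags = []
        for d in range(2 * window, len(daily_counts)):
            earlier = sum(daily_counts[d - 2 * window:d - window]) / window
            recent = sum(daily_counts[d - window:d]) / window
            if earlier > rise_factor * mean_rate and recent < quiet_factor * mean_rate:
                flags.append(d)
        return flags

    # Toy catalog: steady activity, a 90-day burst, then a quiet spell
    toy = [1] * 200 + [6] * 90 + [0] * 120
    print(rise_then_quiescence(toy))   # -> days 367..409, once the quiescence is established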