Reverse Detection of Short-Term Earthquake Precursors
We introduce a new approach to short-term earthquake prediction based on the
concept of self-organization of seismically active fault networks. That approach
is named "Reverse Detection of Precursors" (RDP), since it considers precursors
in reverse order of their appearance. This makes it possible to detect
precursors undetectable by direct analysis. Possible mechanisms underlying RDP
are outlined. RDP is described with a concrete example: we consider as
short-term precursors the newly introduced chains of earthquakes reflecting the
rise of an earthquake correlation range; and detect (retrospectively) such
chains a few months before two prominent Californian earthquakes - Landers,
1992, M = 7.6, and Hector Mine, 1999, M = 7.3, with one false alarm. Similar
results (described elsewhere) are obtained by RDP for 21 more strong
earthquakes in California (M >= 6.4), Japan (M >= 7.0) and the Eastern
Mediterranean (M >= 6.5). Validation of the RDP approach requires, as always,
advance prediction, for which this study lays the groundwork. We have the first
case of advance prediction; it was reported before the Tokachi-oki earthquake (near
Hokkaido island, Japan), Sept. 25, 2003, M = 8.1. RDP has potentially important
applications to other precursors and to prediction of other critical phenomena
besides earthquakes. In particular, it might vindicate some short-term
precursors, previously rejected as giving too many false alarms.
Comment: 17 pages, 5 figures
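The chain precursor described in this abstract can be illustrated with a minimal sketch: two earthquakes are linked when they are close in both time and space, and unusually long chains signal a rising correlation range. The thresholds and the toy catalog below are hypothetical, not the paper's actual parameters.

```python
# Linking earthquakes into "chains": two events are connected when they are
# close in both time and space, and a chain is a connected set of such
# events; unusually long chains signal a rising correlation range. The
# thresholds and the toy catalog are hypothetical.
from math import hypot

# toy catalog: (time in days, x in km, y in km)
catalog = [(0, 0, 0), (2, 5, 3), (4, 8, 6), (50, 100, 100), (52, 103, 98)]
TAU, R0 = 10.0, 20.0        # maximum time gap (days) and distance (km)

def close(a, b):
    """True when events a and b fall within both thresholds."""
    return abs(a[0] - b[0]) <= TAU and hypot(a[1] - b[1], a[2] - b[2]) <= R0

# Greedy single-link clustering: an event joins the first chain containing
# an event it is close to, otherwise it starts a new chain.
chains = []
for ev in catalog:
    for chain in chains:
        if any(close(ev, other) for other in chain):
            chain.append(ev)
            break
    else:
        chains.append([ev])

print([len(c) for c in chains])
```

On this toy catalog the first three events form one chain and the last two another; in the RDP setting one would then ask, in reverse order, which earlier precursors preceded an anomalously long chain.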
Predictability of extreme events in a branching diffusion model
We propose a framework for studying predictability of extreme events in
complex systems. Major conceptual elements -- hierarchical structure, spatial
dynamics, and external driving -- are combined in a classical branching
diffusion with immigration. New elements -- observation space and observed
events -- are introduced in order to formulate a prediction problem patterned
after the geophysical and environmental applications. The problem consists of
estimating the likelihood of occurrence of an extreme event given the
observations of smaller events while the complete internal dynamics of the
system is unknown. We look for premonitory patterns that emerge as an extreme
event approaches; those patterns are deviations from the system's long-term
averages. We have found a single control parameter that governs multiple
spatio-temporal premonitory patterns. For that purpose, we derive i) complete
analytic description of time- and space-dependent size distribution of
particles generated by a single immigrant; ii) the steady-state moments that
correspond to multiple immigrants; and iii) size- and space-based asymptotics
for the particle size distribution. Our results suggest a mechanism for
universal premonitory patterns and provide a natural framework for their
theoretical and empirical study.
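The branching-diffusion picture lends itself to a compact simulation sketch. In the toy version below (all rates are hypothetical), each particle diffuses during its lifetime and then either splits in two or dies; the size distribution of clusters spawned by independent immigrants can then be sampled empirically.

```python
# A toy branching diffusion: each particle takes a diffusion step during an
# (implicit) lifetime, then either splits in two or dies. The branching
# probability and step scale are hypothetical; BRANCH_P < 0.5 keeps the
# process subcritical so clusters stay finite.
import random

random.seed(1)

BRANCH_P = 0.45    # probability a particle splits into two offspring
SIGMA = 1.0        # diffusion step scale per generation
MAX_DEPTH = 50     # hard cap to bound the recursion

def cluster(x=0.0, depth=0):
    """Return the final positions of the cluster spawned by one immigrant."""
    if depth >= MAX_DEPTH:
        return [x]
    x += random.gauss(0.0, SIGMA)          # diffusion step
    if random.random() < BRANCH_P:         # branch into two offspring
        return cluster(x, depth + 1) + cluster(x, depth + 1)
    return [x]                             # the particle dies

# Empirical size distribution over many independent immigrants
sizes = [len(cluster()) for _ in range(10000)]
mean = sum(sizes) / len(sizes)
tail = sum(s >= 10 for s in sizes) / len(sizes)
print(f"mean cluster size {mean:.2f}, P(size >= 10) = {tail:.3f}")
```

Deviations of such size statistics from their long-term averages are the kind of premonitory pattern the framework studies.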
Evaluation of salmon and steelhead spawning habitat quality in the South Fork Trinity River Basin, 1997
Sediment sampling was used to evaluate Chinook salmon
(Oncorhynchus tshawytscha) and steelhead (O. mykiss) spawning habitat quality in the South Fork Trinity River (SFTR) basin. Sediment samples were collected using a McNeil-type sampler and wet sieved through a series of Tyler screens (25.00 mm, 12.50 mm, 6.30 mm, 3.35 mm, 1.00 mm, and 0.85 mm). Fines (particles < 0.85 mm) were determined after a 10-minute settling period in Imhoff cones. Thirteen stations were sampled in the SFTR basin: five stations were located in the mainstem SFTR between rk 2.1 and 118.5, two stations each were located in the East Fork (EF) of the SFTR, Grouse Creek, and Madden Creek, and one station each was located in Eltapom and Hayfork Creeks. Sample mean fines (particles < 0.85 mm) for SFTR stations ranged between
14.4 and 19.4%; tributary station sample mean fines ranged between 3.4 and 19.4%. Decreased egg survival would be expected at four of five mainstem SFTR stations and at one station each in the EF of the SFTR and Grouse Creek, where fines content exceeds 15%. Small gravel/sand content measured at all stations was high and exceeded levels associated with reduced sac fry emergence rates. Reduction of egg survival or sac fry emergence due to sedimentation in spawning gravels could lead to reduced juvenile production from the South Fork Trinity River.
(PDF contains 18 pages.)
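For readers unfamiliar with sieve analysis, percent fines is simply the mass of material passing the finest (0.85 mm) screen divided by the total sample mass. A minimal sketch with hypothetical masses (not data from this survey):

```python
# Computing percent fines from wet-sieve fractions: fines are the material
# passing the finest (0.85 mm) screen, expressed as a share of the total
# sample mass. All masses below are hypothetical illustrations.
retained = {              # grams retained on each Tyler screen (mm opening)
    25.00: 420.0,
    12.50: 310.0,
    6.30: 250.0,
    3.35: 180.0,
    1.00: 120.0,
    0.85: 30.0,
}
fines = 95.0              # grams passing the 0.85 mm screen

total = sum(retained.values()) + fines
pct_fines = 100.0 * fines / total
print(f"percent fines (< 0.85 mm): {pct_fines:.1f}%")
```

The same ratio, computed per station and averaged over samples, yields the station means reported in the abstract.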
The importance of detailed epigenomic profiling of different cell types within organs.
The human body consists of hundreds of kinds of cells specified from a single genome overlaid with cell type-specific epigenetic information. Comprehensively profiling the body's distinct epigenetic landscapes will allow researchers to verify cell types used in regenerative medicine and to determine the epigenetic effects of disease, environmental exposures and genetic variation. Key marks/factors that should be investigated include regions of nucleosome-free DNA accessible to regulatory factors, histone marks defining active enhancers and promoters, DNA methylation levels, regulatory RNAs, and factors controlling the three-dimensional conformation of the genome. Here we use the lung to illustrate the importance of investigating an organ's purified cell epigenomes, and outline the challenges and promise of realizing a comprehensive catalog of primary cell epigenomes.
Watching gene expression in color.
A combination of two fluorescent proteins with different half-lives allows gene expression to be followed with improved time resolution.
Predictability in the ETAS Model of Interacting Triggered Seismicity
As part of an effort to develop a systematic methodology for earthquake
forecasting, we use a simple model of seismicity based on interacting events
which may trigger a cascade of earthquakes, known as the Epidemic-Type
Aftershock Sequence model (ETAS). The ETAS model is constructed on a bare
(unrenormalized) Omori law, the Gutenberg-Richter law and the idea that large
events trigger more numerous aftershocks. For simplicity, we do not use the
information on the spatial location of earthquakes and work only in the time
domain. We offer an analytical approach to account for the as-yet-unobserved
triggered seismicity, adapted to the problem of forecasting future seismic rates
at varying horizons from the present. Tests on synthetic catalogs
strongly confirm the importance of taking into account all the cascades of
still-unobserved triggered events in order to correctly predict the future
level of seismicity beyond a few minutes. We find strong predictability if
one is willing to predict only a small fraction of the large-magnitude targets.
However, the probability gains degrade rapidly when one attempts to predict a
larger fraction of the targets. This is because a significant fraction of
events remains uncorrelated with past seismicity. This delineates the
fundamental limits underlying forecasting skills, stemming from an intrinsic
stochastic component in these interacting triggered seismicity models.
Comment: LaTeX file of 20 pages + 15 EPS figures + 2 tables, in press in J. Geophys. Res.
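The time-domain ETAS construction described above can be sketched as a simple branching simulation: background events arrive as a Poisson process, magnitudes follow the Gutenberg-Richter law, and each event triggers a Poisson number of offspring whose mean grows with its magnitude and whose delays follow an Omori-type kernel. All parameter values below are hypothetical, chosen only to keep the cascade subcritical.

```python
# A minimal time-domain ETAS sketch. All parameters are hypothetical,
# chosen so the branching ratio K * B / (B - ALPHA) = 0.25 < 1 (subcritical).
import math
import random

random.seed(7)

MU, T_END = 0.2, 1000.0   # background rate (events/day) and time horizon
B, M0 = 1.0, 3.0          # Gutenberg-Richter b-value and magnitude cutoff
K, ALPHA = 0.05, 0.8      # mean offspring per event: K * 10**(ALPHA*(m - M0))
C, P = 0.01, 1.2          # Omori kernel ~ (t + C)**(-P)

def poisson(lam):
    """Knuth's method; fine for the modest rates used here."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

def gr_magnitude():
    """Sample m >= M0 with exceedance probability 10**(-B*(m - M0))."""
    return M0 + random.expovariate(B * math.log(10))

def omori_delay():
    """Inverse-CDF sample of the normalized Omori kernel."""
    return C * ((1.0 - random.random()) ** (-1.0 / (P - 1.0)) - 1.0)

background = [(random.uniform(0.0, T_END), gr_magnitude())
              for _ in range(poisson(MU * T_END))]
catalog, queue = [], list(background)
while queue:                        # follow the full cascade of triggered events
    t, m = queue.pop()
    catalog.append((t, m))
    for _ in range(poisson(K * 10.0 ** (ALPHA * (m - M0)))):
        t_child = t + omori_delay()
        if t_child < T_END:
            queue.append((t_child, gr_magnitude()))

catalog.sort()
print(f"{len(background)} background events grew to {len(catalog)} total events")
```

The abstract's central point corresponds to the `while queue` loop: forecasting the future rate requires summing over all still-unobserved generations of this cascade, not just the directly observed parents.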
Predicting Failure using Conditioning on Damage History: Demonstration on Percolation and Hierarchical Fiber Bundles
We formulate the problem of probabilistic predictions of global failure in
the simplest possible model based on site percolation and on one of the
simplest models of time-dependent rupture, a hierarchical fiber bundle model. We
show that conditioning the predictions on the knowledge of the current degree
of damage (occupancy density or number and size of cracks) and on some
information on the largest cluster significantly improves the prediction
accuracy, in particular by making it possible to identify realizations with
anomalously small or large clusters (cracks). We quantify the prediction gains
using two measures, the relative specific information gain (which is the
variation of entropy obtained by adding new information) and the
root-mean-square of the prediction errors over a large ensemble of
realizations. The bulk of our simulations were obtained with the
two-dimensional site percolation model on a lattice of fixed size; the results hold true for other lattice sizes. For the hierarchical fiber
bundle model, conditioning the measures of damage on information about the
location and size of the largest crack significantly extends the critical
region and improves the prediction skill. These examples illustrate how ongoing
damage can be used to reveal both the realization-dependent pre-existing
heterogeneity and the damage scenario followed by each specific sample.
Comment: 7 pages + 11 figures
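The kind of conditioning this abstract describes can be illustrated on small site-percolation samples: knowing the size of the current largest cluster sharpens the forecast of whether a sample spans (a proxy for global failure). The lattice size, occupancy, and cluster-size threshold below are hypothetical.

```python
# Conditioning a failure forecast on the largest cluster, sketched on small
# two-dimensional site-percolation samples near the percolation threshold.
# "Failure" here means a top-to-bottom spanning cluster; lattice size,
# occupancy, and the cluster-size threshold are hypothetical.
import random

random.seed(3)

N, P_OCC = 16, 0.59          # lattice size and occupancy near p_c ~ 0.5927

def sample():
    """Return (largest cluster size, whether a cluster spans top to bottom)."""
    grid = [[random.random() < P_OCC for _ in range(N)] for _ in range(N)]
    seen = [[False] * N for _ in range(N)]
    largest, spans = 0, False
    for i in range(N):
        for j in range(N):
            if grid[i][j] and not seen[i][j]:
                stack, cells = [(i, j)], []
                seen[i][j] = True
                while stack:             # flood-fill one cluster
                    x, y = stack.pop()
                    cells.append((x, y))
                    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nx, ny = x + dx, y + dy
                        if 0 <= nx < N and 0 <= ny < N and grid[nx][ny] and not seen[nx][ny]:
                            seen[nx][ny] = True
                            stack.append((nx, ny))
                largest = max(largest, len(cells))
                rows = {x for x, _ in cells}
                if 0 in rows and N - 1 in rows:
                    spans = True
    return largest, spans

runs = [sample() for _ in range(2000)]
p_all = sum(s for _, s in runs) / len(runs)

def cond(sel):
    """Spanning probability conditioned on the largest-cluster size."""
    sub = [s for l, s in runs if sel(l)]
    return sum(sub) / len(sub) if sub else float("nan")

print(f"P(span) = {p_all:.2f}, given largest >= 100: {cond(lambda l: l >= 100):.2f}, "
      f"given largest < 100: {cond(lambda l: l < 100):.2f}")
```

The gap between the two conditional probabilities is a crude analogue of the information gain the paper measures with entropy and prediction-error statistics.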
Prediction of Large Events on a Dynamical Model of a Fault
We present results for long term and intermediate term prediction algorithms
applied to a simple mechanical model of a fault. We use long term prediction
methods based, for example, on the distribution of repeat times between large
events to establish a benchmark for predictability in the model. In comparison,
intermediate term prediction techniques, analogous to the pattern recognition
algorithms CN and M8 introduced and studied by Keilis-Borok et al., are more
effective at predicting coming large events. We consider the implications of
several different quality functions Q which can be used to optimize the
algorithms with respect to features such as space, time, and magnitude windows,
and find that our results are not overly sensitive to variations in these
algorithm parameters. We also study the intrinsic uncertainties associated with
seismicity catalogs of restricted lengths.
Comment: 33 pages, plain.tex with special macros included
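The idea of optimizing a quality function Q over algorithm parameters can be sketched on a synthetic catalog: a "time since last large event" alarm whose threshold is tuned to minimize a Molchan-style Q = miss rate + alarm-time fraction. The recurrence statistics and the parameter grid below are hypothetical, not the paper's model.

```python
# Optimizing a quality function Q over an algorithm parameter: an alarm
# switches on `threshold` time units after each large event and stays on
# until the next one; Q = (fraction of missed events) + (fraction of time
# in alarm). The recurrence model and parameter grid are hypothetical.
import random

random.seed(11)

times, t = [], 0.0
for _ in range(200):                    # synthetic large-event catalog
    t += random.gauss(100.0, 25.0)      # quasi-periodic recurrence
    times.append(t)

def quality(threshold):
    """Molchan-style quality: miss rate plus alarm-time fraction."""
    missed = alarm_time = 0.0
    prev = 0.0
    for ev in times:
        gap = ev - prev
        if gap <= threshold:
            missed += 1.0               # event struck before the alarm began
        else:
            alarm_time += gap - threshold
        prev = ev
    return missed / len(times) + alarm_time / times[-1]

best = min(range(10, 200, 5), key=quality)
print(f"best threshold {best}, Q = {quality(best):.2f}")
```

Scanning other windows (space, magnitude) the same way is the multi-parameter optimization the abstract reports as being insensitive to moderate parameter variations.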
