Statistical Mechanics and Information-Theoretic Perspectives on Complexity in the Earth System
A Nonextensive Statistical Physics Analysis of the 1995 Kobe, Japan Earthquake
This paper presents an analysis of the distribution of earthquake magnitudes for the period 1990–1998 in a broad area surrounding the epicenter of the 1995 Kobe earthquake. The frequency–magnitude distribution analysis is performed in a nonextensive statistical physics context. The nonextensive parameter q_M, which is related to the frequency–magnitude distribution, reflects the existence of long-range correlations and is used as an index of the physical state of the studied area. The variations of q_M values are examined over the period 1990–1998. A significant increase of q_M occurs some months before the strong earthquake of April 9, 1994, indicating the start of a preparation phase prior to the Kobe earthquake. Notably, this increase coincides with the occurrence of six seismic events, each of magnitude M = 4.1. The evolution of seismicity, along with the increase of q_M, indicates the system's transition away from equilibrium and its preparation for energy release. The variations of q_M values appear to track rather well the physical evolution towards the 1995 Kobe earthquake.
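The nonextensive frequency–magnitude analysis the abstract describes is commonly based on the fragment-asperity relation of Silva et al. (2006), in which the index q_M appears as the Tsallis parameter q. The sketch below implements that cumulative relation; the parameter values in the usage example are purely illustrative, not fitted to the Kobe catalog.

```python
def tsallis_gr(m, q, a, n_total):
    # Cumulative number of events with magnitude >= m under the
    # fragment-asperity (nonextensive) frequency-magnitude relation,
    # in the form of Silva et al. (2006):
    #   N(>m) = n_total * [1 - ((1-q)/(2-q)) * 10**m / a**(2/3)] ** ((2-q)/(1-q))
    # q is the nonextensive index (q_M in the abstract); a sets the
    # proportionality between released energy and fragment volume.
    exponent = (2.0 - q) / (1.0 - q)
    base = 1.0 - ((1.0 - q) / (2.0 - q)) * (10.0 ** m) / (a ** (2.0 / 3.0))
    return n_total * base ** exponent

# Illustrative values only: q = 1.5, a = 1e6, 1000 events in the catalog.
n4 = tsallis_gr(4.0, q=1.5, a=1e6, n_total=1000.0)
n2 = tsallis_gr(2.0, q=1.5, a=1e6, n_total=1000.0)
```

For q > 1 the curve decays more slowly than the classical Gutenberg–Richter exponential, which is why a rising q_M is read as growing long-range correlation in the seismicity.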
Unfolding the procedure of characterizing recorded ultra low frequency, kHz and MHz electromagnetic anomalies prior to the L'Aquila earthquake as pre-seismic ones. Part I
Ultra low frequency, kHz and MHz electromagnetic anomalies were recorded
prior to the L'Aquila catastrophic earthquake that occurred on April 6, 2009.
The main aims of this contribution are: (i) To suggest a procedure for the
designation of detected EM anomalies as seismogenic ones. We do not expect it to
be possible to provide a succinct and solid definition of a pre-seismic EM
emission. Instead, we attempt, through a multidisciplinary analysis, to provide
elements of a definition. (ii) To link the detected MHz and kHz EM anomalies
with equivalent last stages of the L'Aquila earthquake preparation process.
(iii) To put forward physically meaningful arguments to support a way of
quantifying the time to global failure and the identification of distinguishing
features beyond which the evolution towards global failure becomes
irreversible. The whole effort is unfolded in two consecutive parts. We clarify
that we aim to determine not only whether a single EM anomaly is pre-seismic in
itself, but mainly whether a combination of kHz, MHz, and ULF EM anomalies can
be characterized as pre-seismic.
Colloquium: Physical approaches to DNA sequencing and detection
With the continued improvement of sequencing technologies, the prospect of genome-based medicine is now at the forefront of scientific research. To realize this potential, however, a revolutionary sequencing method is needed for the cost-effective and rapid interrogation of individual genomes. This capability is likely to be provided by a physical approach to probing DNA at the single-nucleotide level. This is in sharp contrast to current techniques and instruments that probe (through chemical elongation, electrophoresis, and optical detection) length differences and terminating bases of strands of DNA. Several physical approaches to DNA detection have the potential to deliver fast and low-cost sequencing. Central to these approaches is the concept of nanochannels or nanopores, which allow for the spatial confinement of DNA molecules. In addition to their possible impact in medicine and biology, the methods offer ideal test beds to study open scientific issues and challenges in the relatively unexplored area at the interface between solids, liquids, and biomolecules at the nanometer length scale. This Colloquium emphasizes the physics behind these methods and ideas, critically describes their advantages and drawbacks, and discusses future research opportunities in the field.
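In nanopore detection, a translocating DNA molecule partially blocks the ionic current through the pore, so the raw signal is a trace of transient current drops ("blockades"). The following is a toy threshold-based event detector, not any specific instrument's pipeline; the trace, open-pore level, and threshold fraction are illustrative assumptions.

```python
def detect_blockades(current, open_pore, threshold_frac=0.7):
    # Find contiguous runs of samples where the ionic current drops below
    # a fraction of the open-pore level -- a simple model of a DNA
    # translocation event ("current blockade") in a nanopore trace.
    # Returns (start_index, end_index_exclusive, mean_blockade_current) tuples.
    threshold = threshold_frac * open_pore
    events, start = [], None
    for i, x in enumerate(current):
        if x < threshold and start is None:
            start = i                      # blockade begins
        elif x >= threshold and start is not None:
            seg = current[start:i]         # blockade ends; record the event
            events.append((start, i, sum(seg) / len(seg)))
            start = None
    if start is not None:                  # trace ends mid-blockade
        seg = current[start:]
        events.append((start, len(current), sum(seg) / len(seg)))
    return events

# Synthetic trace (in arbitrary current units): two blockade events.
trace = [100.0] * 5 + [40.0] * 3 + [100.0] * 4 + [55.0] * 2 + [100.0] * 2
events = detect_blockades(trace, open_pore=100.0)
```

In practice the depth and duration of each blockade carry the physical information about the confined molecule; real pipelines add filtering and baseline tracking on top of this basic idea.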
Beyond the Shannon-Khinchin Formulation: The Composability Axiom and the Universal Group Entropy
The notion of entropy is ubiquitous both in natural and social sciences. In
the last two decades, a considerable effort has been devoted to the study of
new entropic forms, which generalize the standard Boltzmann-Gibbs (BG) entropy
and are widely applicable in thermodynamics, quantum mechanics and information
theory. In [23], by extending previous ideas of Shannon [38], [39], Khinchin
proposed an axiomatic definition of the BG entropy, based on four requirements,
nowadays known as the Shannon-Khinchin (SK) axioms.
The purpose of this paper is twofold. First, we show that there exists an
intrinsic group-theoretical structure behind the notion of entropy. It comes
from the requirement of composability of an entropy with respect to the union
of two statistically independent subsystems, which we propose in an axiomatic
formulation. Second, we show that there exists a simple universal class of
admissible entropies. This class contains many well known examples of entropies
and infinitely many new ones, a priori multi-parametric. Due to its specific
relation with the universal formal group, the new family of entropies
introduced in this work will be called the universal-group entropy. A new
example of multi-parametric entropy is explicitly constructed.
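The composability requirement can be illustrated with the Tsallis entropy, a well-known member of such generalized families: for two statistically independent subsystems A and B it satisfies S_q(A+B) = S_q(A) + S_q(B) + (1-q) S_q(A) S_q(B), which reduces to BG additivity as q -> 1. The sketch below verifies this composition law numerically; the distributions and the value of q are illustrative.

```python
def tsallis_entropy(p, q):
    # Tsallis entropy S_q(p) = (1 - sum_i p_i**q) / (q - 1), with k_B = 1.
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def joint_independent(pa, pb):
    # Joint distribution of two statistically independent subsystems:
    # p_{ij} = p_i * p_j.
    return [x * y for x in pa for y in pb]

# Composability check: S_q(A+B) = S_q(A) + S_q(B) + (1-q) S_q(A) S_q(B).
pa, pb, q = [0.2, 0.8], [0.5, 0.3, 0.2], 1.7
lhs = tsallis_entropy(joint_independent(pa, pb), q)
rhs = (tsallis_entropy(pa, q) + tsallis_entropy(pb, q)
       + (1.0 - q) * tsallis_entropy(pa, q) * tsallis_entropy(pb, q))
```

The identity holds exactly for independent subsystems, which is what makes this entropy "composable" in the sense the paper axiomatizes for the broader universal-group family.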