Structural Properties of Self-Attracting Walks
Self-attracting walks (SATW) with attractive interaction u > 0 display a
swelling-collapse transition at a critical u_{\mathrm{c}} for dimensions d >=
2, analogous to the \Theta transition of polymers. We are interested in the
structure of the clusters generated by SATW below u_{\mathrm{c}} (swollen
walk), above u_{\mathrm{c}} (collapsed walk), and at u_{\mathrm{c}}, which can
be characterized by the fractal dimensions of the clusters d_{\mathrm{f}} and
their interface d_{\mathrm{I}}. Using scaling arguments and Monte Carlo
simulations, we find that for u<u_{\mathrm{c}}, the structures are in the
universality class of clusters generated by simple random walks. For
u>u_{\mathrm{c}}, the clusters are compact, i.e. d_{\mathrm{f}}=d and
d_{\mathrm{I}}=d-1. At u_{\mathrm{c}}, the SATW is in a new universality class.
The clusters are compact in both d=2 and d=3, but their interface is fractal:
d_{\mathrm{I}}=1.50\pm0.01 and 2.73\pm0.03 in d=2 and d=3, respectively. In
d=1, where the walk is collapsed for all u and no swelling-collapse transition
exists, we derive analytical expressions for the average number of visited
sites and the mean time to visit S sites.Comment: 15 pages, 8 postscript figures, submitted to Phys. Rev.
Beyond Blobs in Percolation Cluster Structure: The Distribution of 3-Blocks at the Percolation Threshold
The incipient infinite cluster appearing at the bond percolation threshold
can be decomposed into singly-connected ``links'' and multiply-connected
``blobs.'' Here we decompose blobs into objects known in graph theory as
3-blocks. A 3-block is a graph that cannot be separated into disconnected
subgraphs by cutting the graph at 2 or fewer vertices. Clusters, blobs, and
3-blocks are special cases of $k$-blocks with $k=1$, 2, and 3, respectively. We
study bond percolation clusters at the percolation threshold on 2-dimensional
square lattices and 3-dimensional cubic lattices and, using Monte-Carlo
simulations, determine the distribution of the sizes of the 3-blocks into which
the blobs are decomposed. We find that the fractal dimension $d_{3}$ of the
3-blocks, in both 2D and 3D, is significantly smaller than the fractal
dimension of the blobs, making possible more efficient calculation of
percolation properties. Additionally, the closeness of the estimated values of
$d_{3}$ in 2D and 3D is consistent with the possibility that $d_{3}$ is
dimension independent. Generalizing the concept of
the backbone, we introduce the concept of a ``$k$-bone'', which is the set of
all points in a percolation system connected to $k$ disjoint terminal points
(or sets of disjoint terminal points) by $k$ disjoint paths. We argue that the
fractal dimension of a $k$-bone is equal to the fractal dimension of
$k$-blocks, allowing us to discuss the relation between the fractal dimension
of $k$-blocks and recent work on path crossing probabilities.
Comment: All but first 2 figs. are low resolution and are best viewed when printed
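The defining property of a 3-block (no separation by cutting the graph at 2 or fewer vertices) can be checked by brute force on small graphs. A sketch, with graphs as adjacency dicts and all names hypothetical:

```python
from itertools import combinations

def connected_after_removal(adj, removed):
    """DFS connectivity test on the graph with the `removed` vertices deleted."""
    nodes = [v for v in adj if v not in removed]
    if len(nodes) <= 1:
        return True
    seen, stack = {nodes[0]}, [nodes[0]]
    while stack:
        v = stack.pop()
        for w in adj[v]:
            if w not in removed and w not in seen:
                seen.add(w)
                stack.append(w)
    return len(seen) == len(nodes)

def is_3_block_candidate(adj):
    """True if no cut set of size <= 2 disconnects the graph (the 3-block
    property, checked here by exhaustive removal of 1 or 2 vertices)."""
    verts = list(adj)
    for k in (1, 2):
        for cut in combinations(verts, k):
            if not connected_after_removal(adj, set(cut)):
                return False
    return connected_after_removal(adj, set())

# K4 survives any 2-vertex cut; a 4-cycle is split by removing two
# opposite vertices, so it is not a 3-block.
k4 = {0: [1, 2, 3], 1: [0, 2, 3], 2: [0, 1, 3], 3: [0, 1, 2]}
c4 = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
```

The exhaustive check is exponential only in the cut size (here fixed at 2), so it is quadratic in the number of vertices; production decompositions of percolation blobs would instead use a linear-time 3-connected-components algorithm.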
Heuristic Segmentation of a Nonstationary Time Series
Many phenomena, both natural and human-influenced, give rise to signals whose
statistical properties change under time translation, i.e., are nonstationary.
For some practical purposes, a nonstationary time series can be seen as a
concatenation of stationary segments. Using a segmentation algorithm, it has
been reported that for heart-beat data and Internet traffic fluctuations the
distribution of durations of these stationary segments decays with a power-law
tail. A potential technical difficulty that has not been thoroughly
investigated is that a nonstationary time series with a (scale-free) power law
distribution of stationary segments is harder to segment than other
nonstationary time series because of the wider range of possible segment sizes.
Here, we investigate the validity of a heuristic segmentation algorithm
recently proposed by Bernaola-Galvan et al. by systematically analyzing
surrogate time series with different statistical properties. We find that if a
given nonstationary time series has stationary periods whose size is
distributed as a power law, the algorithm can split the time series into a set
of stationary segments with the correct statistical properties. We also find
that the estimated power law exponent of the distribution of stationary-segment
sizes is affected by (i) the minimum segment size and (ii) the ratio of the
standard deviation of the segment mean values to the standard
deviation of the fluctuations within a segment. Furthermore, we determine that
the performance of the algorithm is generally not affected by uncorrelated
noise spikes or by weak long-range temporal correlations of the fluctuations
within segments.
Comment: 23 pages, 14 figures
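A simplified sketch of a segmentation heuristic of this kind: slide a pointer along the series, compute Student's t for the difference of means between the left and right parts, cut at the position of maximum t if it exceeds a threshold, and recurse on both halves. The fixed threshold `t_min` stands in for the significance criterion of the original algorithm, and all names are illustrative:

```python
import random
import statistics

def t_statistic(left, right):
    """Student's t for the difference of means with pooled standard deviation."""
    n1, n2 = len(left), len(right)
    s1, s2 = statistics.pvariance(left), statistics.pvariance(right)
    sp = ((n1 * s1 + n2 * s2) / (n1 + n2 - 2)) ** 0.5
    if sp == 0:
        return 0.0
    return abs(statistics.fmean(left) - statistics.fmean(right)) / (
        sp * (1 / n1 + 1 / n2) ** 0.5)

def segment(series, t_min=5.0, min_size=20):
    """Cut where the left/right mean-difference t statistic peaks, if the
    peak exceeds t_min; then recurse on both halves. Returns cut indices."""
    best_t, best_i = 0.0, None
    for i in range(min_size, len(series) - min_size + 1):
        t = t_statistic(series[:i], series[i:])
        if t > best_t:
            best_t, best_i = t, i
    if best_i is None or best_t < t_min:
        return []
    left_cuts = segment(series[:best_i], t_min, min_size)
    right_cuts = segment(series[best_i:], t_min, min_size)
    return left_cuts + [best_i] + [best_i + c for c in right_cuts]

# Surrogate series: two stationary segments with different means.
rng = random.Random(1)
data = ([rng.gauss(0.0, 0.5) for _ in range(200)]
        + [rng.gauss(2.0, 0.5) for _ in range(200)])
cuts = segment(data)
```

The `min_size` parameter is the minimum segment size mentioned above as point (i), and the mean jump relative to the within-segment noise is the ratio of point (ii); varying the two on surrogate data reproduces the kind of validity test the abstract describes.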
Spurious trend switching phenomena in financial markets
The observation of power laws in the times to extrema of volatility, volume,
and intertrade times, from milliseconds to years, is shown to result
straightforwardly from the selection of biased statistical subsets of
realizations in otherwise featureless processes such as random walks. The bias
stems from the selection of price peaks, which imposes a condition on the
statistics of price changes and of trade volumes that skews their
distributions. For the intertrade times, the extrema and the power laws result
from the format of the transaction data.
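The selection-bias mechanism can be illustrated on an unbiased random walk: conditioning on local price peaks deterministically skews the steps around the selected points, even though the full step series is symmetric (a toy construction, not the paper's analysis):

```python
import random

rng = random.Random(7)
steps = [rng.choice((-1, 1)) for _ in range(100000)]
walk = [0]
for s in steps:
    walk.append(walk[-1] + s)

# Local maxima of the walk (excluding endpoints).
peaks = [i for i in range(1, len(walk) - 1)
         if walk[i] > walk[i - 1] and walk[i] > walk[i + 1]]

# Mean step immediately before vs. immediately after a peak: the selection
# alone forces +1 before and -1 after, a bias absent from the full series.
before = sum(steps[i - 1] for i in peaks) / len(peaks)
after = sum(steps[i] for i in peaks) / len(peaks)
mean_step = sum(steps) / len(steps)
```

Widening the conditioning window relaxes the bias from this deterministic extreme, but the distributions of increments and waiting times near the selected extrema remain skewed, which is the featureless origin of the apparent switching laws.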
Statistical mechanics in the context of special relativity
In the present effort we show that the entropy $S_{\kappa}$ is the unique existing entropy obtained
by a continuous deformation of the Shannon-Boltzmann entropy that preserves unaltered its fundamental properties of concavity,
additivity, and extensivity. Subsequently, we explain the origin of the
deformation mechanism introduced by the parameter $\kappa$ and show that this
deformation emerges naturally within Einstein's special relativity. Furthermore, we
extend the theory in order to treat statistical systems in a time-dependent and
relativistic context. We then show that it is possible to determine, in a
self-consistent scheme within special relativity, the value of the free
parameter $\kappa$, which turns out to depend on the light speed $c$ and reduces
to zero as $c \to \infty$, recovering in this way the ordinary statistical
mechanics and thermodynamics. The novel statistical mechanics constructed
starting from the above entropy, preserves unaltered the mathematical and
epistemological structure of the ordinary statistical mechanics and is suitable
to describe a very large class of experimentally observed phenomena in low and
high energy physics and in natural, economic and social sciences. Finally, in
order to test the correctness and predictability of the theory, as working
example we consider the cosmic rays spectrum, which spans 13 decades in energy
and 33 decades in flux, finding a high quality agreement between our
predictions and observed data.
PACS number(s): 05.20.-y, 51.10.+y, 03.30.+p, 02.20.-a
Comment: 17 pages (two columns), 5 figures, RevTeX4, minor typing corrections
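For reference, the deformed entropy of this kind is standardly built from a deformed logarithm; a commonly quoted form (reconstructed from the $\kappa$-statistics literature, not quoted from this abstract) is

```latex
\ln_{\kappa}(x) = \frac{x^{\kappa} - x^{-\kappa}}{2\kappa}, \qquad
\exp_{\kappa}(x) = \left(\sqrt{1 + \kappa^{2} x^{2}} + \kappa x\right)^{1/\kappa},
\qquad
S_{\kappa} = -\sum_{i} n_{i} \, \ln_{\kappa} n_{i},
```

with both deformed functions reducing to $\ln x$ and $e^{x}$ as $\kappa \to 0$, so that $S_{\kappa}$ continuously reduces to the Shannon-Boltzmann entropy in that limit.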
Symbolic stochastic dynamical systems viewed as binary N-step Markov chains
A theory of systems with long-range correlations based on the consideration
of binary N-step Markov chains is developed. In the model, the conditional
probability that the i-th symbol in the chain equals zero (or unity) is a
linear function of the number of unities among the preceding N symbols. The
correlation and distribution functions as well as the variance of the number of
symbols in the words of arbitrary length L are obtained analytically and
numerically. A self-similarity of the studied stochastic process is revealed
and the similarity group transformation of the chain parameters is presented.
The diffusion Fokker-Planck equation governing the distribution function of the
L-words is explored. If the persistent correlations are not extremely strong,
the distribution function is shown to be Gaussian with the variance being
nonlinearly dependent on L. The applicability of the developed theory to the
coarse-grained written and DNA texts is discussed.
Comment: 14 pages, 13 figures
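A sketch of such a chain, assuming the linear rule $P(1 \mid k) = 1/2 + \mu\,(2k/N - 1)$ for $k$ unities among the preceding $N$ symbols (this parametrization and the names are illustrative; $\mu > 0$ gives persistent correlations and requires $|\mu| \le 1/2$):

```python
import random

def binary_markov_chain(length, N, mu, seed=0):
    """Binary N-step Markov chain: the probability that the next symbol is 1
    is linear in the number k of unities among the preceding N symbols,
    p1 = 1/2 + mu*(2k/N - 1) (parametrization assumed for illustration)."""
    rng = random.Random(seed)
    chain = [rng.randint(0, 1) for _ in range(N)]
    k = sum(chain)  # unities among the last N symbols
    for _ in range(length - N):
        p1 = 0.5 + mu * (2 * k / N - 1)
        nxt = 1 if rng.random() < p1 else 0
        chain.append(nxt)
        k += nxt - chain[-N - 1]  # sliding-window update
    return chain

def word_variance(chain, L):
    """Variance of the number of unities in non-overlapping words of length L."""
    counts = [sum(chain[i:i + L]) for i in range(0, len(chain) - L + 1, L)]
    m = sum(counts) / len(counts)
    return sum((c - m) ** 2 for c in counts) / len(counts)

var_persistent = word_variance(binary_markov_chain(50000, 10, 0.3, seed=2), 100)
var_bernoulli = word_variance(binary_markov_chain(50000, 10, 0.0, seed=2), 100)
```

For $\mu = 0$ the chain is an uncorrelated Bernoulli sequence and the word variance is close to $L/4$; persistent correlations inflate it, which is the nonlinear dependence on $L$ that the abstract analyzes.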
Scaling detection in time series: diffusion entropy analysis
The methods currently used to determine the scaling exponent of a complex
dynamic process described by a time series are based on the numerical
evaluation of variance. This means that all of them can be safely applied only
to the case where ordinary statistical properties hold true even if strange
kinetics are involved. We illustrate a method of statistical analysis based on
the Shannon entropy of the diffusion process generated by the time series,
called Diffusion Entropy Analysis (DEA). We adopt artificial Gauss and L\'{e}vy
time series, as prototypes of ordinary and anomalous statistics, respectively,
and we analyse them with the DEA and four ordinary methods of analysis, some of
which are very popular. We show that the DEA determines the correct scaling
exponent even when the statistical properties, as well as the dynamic
properties, are anomalous. The other four methods produce correct results in
the Gauss case but fail to detect the correct scaling in the case of L\'{e}vy
statistics.
Comment: 21 pages, 10 figures, 1 table
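A minimal DEA sketch under the standard recipe: form diffusion trajectories as sums over windows of length $t$, histogram the displacements, compute the Shannon entropy $S(t)$, and fit $S(t) \approx A + \delta \ln t$; the slope $\delta$ is the scaling exponent ($\delta = 0.5$ for Gaussian statistics). The bin width and names are illustrative choices:

```python
import math
import random

def diffusion_entropy(xs, t, bin_width=1.0):
    """Shannon entropy of the histogram of window sums of length t."""
    pref = [0.0]
    for x in xs:
        pref.append(pref[-1] + x)  # prefix sums for O(1) window sums
    sums = [pref[i + t] - pref[i] for i in range(len(xs) - t + 1)]
    counts = {}
    for s in sums:
        b = int(s // bin_width)
        counts[b] = counts.get(b, 0) + 1
    n = len(sums)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def dea_exponent(xs, ts):
    """Least-squares slope of S(t) versus ln t: the scaling exponent delta."""
    pts = [(math.log(t), diffusion_entropy(xs, t)) for t in ts]
    mx = sum(x for x, _ in pts) / len(pts)
    my = sum(y for _, y in pts) / len(pts)
    num = sum((x - mx) * (y - my) for x, y in pts)
    den = sum((x - mx) ** 2 for x, _ in pts)
    return num / den

rng = random.Random(3)
gauss = [rng.gauss(0.0, 1.0) for _ in range(20000)]
delta = dea_exponent(gauss, [4, 8, 16, 32, 64])
```

Because DEA fits the growth of the entropy rather than of the variance, the same recipe applied to a Lévy series recovers the Lévy scaling, where variance-based methods fail.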
Frequency-dependent (ac) Conduction in Disordered Composites: a Percolative Study
In a recent paper [Phys. Rev. B {\bf 57}, 3375 (1998)], we examined in detail
the nonlinear (electrical) dc response of a random resistor cum tunneling-bond
network (RRTN, introduced by us elsewhere to explain the nonlinear response of
metal-insulator-type mixtures). In this work, which is a sequel to that paper,
we consider the ac response of the correlated model based on this network.
Numerical solutions of Kirchhoff's laws for the model give a power-law
exponent (about 0.7 near the percolation threshold) of the modulus of the
complex ac conductance at
moderately low frequencies, in conformity with experiments on various types of
disordered systems. But, at very low frequencies, it gives a simple quadratic
or linear dependence on the frequency depending upon whether the system is
percolating or not. We also discuss the effective medium approximation
(EMA) of our model and of the traditional random network model, and compare
their successes and shortcomings.
Comment: Revised and reduced version with 17 LaTeX pages plus 8 JPEG figures
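The low-frequency dichotomy described above can be illustrated with a two-branch toy circuit rather than the full network: a capacitive (tunneling-like) branch alone has no dc path and its real conductance grows quadratically with frequency, while adding a resistive backbone makes it flat. All component values and names are illustrative:

```python
def series(*zs):
    """Impedance of elements in series."""
    return sum(zs)

def parallel(*zs):
    """Impedance of elements in parallel."""
    return 1 / sum(1 / z for z in zs)

def Z_capacitor(C, omega):
    return 1 / (1j * omega * C)

R, C = 1.0e3, 1.0e-9  # illustrative values; 1/(R*C) = 1e6 rad/s

def admittance_nonpercolating(omega):
    # Tunneling-like branch only: R in series with C, no dc path.
    return 1 / series(R, Z_capacitor(C, omega))

def admittance_percolating(omega):
    # A dc backbone resistor shunted by the R-C branch.
    return 1 / parallel(R, series(R, Z_capacitor(C, omega)))

# At omega << 1/(R*C): Re Y ~ omega^2 without a dc path, ~ constant with one.
g_np_1 = admittance_nonpercolating(1.0).real
g_np_2 = admittance_nonpercolating(2.0).real
g_p_1 = admittance_percolating(1.0).real
g_p_2 = admittance_percolating(2.0).real
```

Analytically, $\mathrm{Re}\,Y = \omega^{2} R C^{2} / (1 + (\omega R C)^{2})$ for the capacitive branch, so doubling a low frequency quadruples the conductance; the percolating case is pinned at $1/R$. The intermediate power-law regime of the full network emerges only from the disorder of many such branches.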
Binary data corruption due to a Brownian agent
We introduce a model of binary data corruption induced by a Brownian agent
(active random walker) on a d-dimensional lattice. A continuum formulation
allows the exact calculation of several quantities related to the density of
corrupted bits \rho; for example the mean of \rho, and the density-density
correlation function. Excellent agreement is found with the results from
numerical simulations. We also calculate the probability distribution of \rho
in d=1, which is found to be log-normal, indicating that the system is governed
by extreme fluctuations.
Comment: 39 pages, 10 figures, RevTeX
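A sketch of such a process in d=1, assuming a simple concrete corruption rule (walker corrupts the bit at its current site with probability q; the rule and names are illustrative, not the paper's exact model):

```python
import random

def corrupted_density(L, t, q, seed=0):
    """Brownian agent on a ring of L bits: at each step it moves +/-1 and,
    with probability q, corrupts the bit at its current site.
    Returns the final density rho of corrupted bits."""
    rng = random.Random(seed)
    bits = [0] * L
    x = 0
    for _ in range(t):
        x = (x + rng.choice((-1, 1))) % L
        if rng.random() < q:
            bits[x] = 1
    return sum(bits) / L

# rho over independent agents: for q = 1, rho is the fraction of distinct
# visited sites, so its spread across runs tracks the span of the walk --
# the extreme fluctuations the abstract refers to.
rhos = [corrupted_density(1000, 5000, 1.0, seed=s) for s in range(20)]
```

Collecting `rhos` over many runs and lattice sizes gives a numerical estimate of the distribution of $\rho$, which the paper finds to be log-normal in d=1.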
A planar diagram approach to the correlation problem
We transpose an idea of 't Hooft from its context of the Yang-Mills theory
of strongly interacting quarks to that of strongly correlated electrons in
transition metal oxides and show that a Hubbard model of N interacting electron
species reduces, to leading orders in N, to a sum of almost planar diagrams.
The resulting generating functional and integral equations are very similar to
those of the FLEX approximation of Bickers and Scalapino. This adds the Hubbard
model at large N to the list of solvable models of strongly correlated
electrons.
PACS Numbers: 71.27.+a 71.10.-w 71.10.Fd
Comment: revtex, 5 pages, with 3 eps figures
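A Hubbard model with $N$ interacting electron species is conventionally written as (a generic SU($N$) form with the $1/N$-scaled interaction used in large-$N$ treatments; the notation is assumed for reference, not quoted from the paper)

```latex
H = -t \sum_{\langle ij \rangle} \sum_{a=1}^{N}
    \left( c^{\dagger}_{ia} c_{ja} + \mathrm{h.c.} \right)
    + \frac{U}{N} \sum_{i} \sum_{a<b} n_{ia}\, n_{ib},
```

so that both terms scale as $O(N)$ and the $1/N$ expansion organizes the diagrammatics, in analogy with 't Hooft's large-$N$ counting for gauge theory.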