Scaling of the distribution of fluctuations of financial market indices
We study the distribution of fluctuations over a time scale Δt (i.e.,
the returns) of the S&P 500 index by analyzing three distinct databases.
Database (i) contains approximately 1 million records sampled at 1 min
intervals for the 13-year period 1984-1996, database (ii) contains 8686 daily
records for the 35-year period 1962-1996, and database (iii) contains 852
monthly records for the 71-year period 1926-1996. We compute the probability
distributions of returns over a time scale Δt, where Δt varies
approximately over a factor of 10^4 - from 1 min up to more than 1 month. We
find that the distributions for Δt ≤ 4 days (1560 mins) are
consistent with a power-law asymptotic behavior, characterized by an exponent
α ≈ 3, well outside the stable Lévy regime 0 < α < 2. To
test the robustness of the S&P result, we perform a parallel analysis on two
other financial market indices. Database (iv) contains 3560 daily records of
the NIKKEI index for the 14-year period 1984-97, and database (v) contains 4649
daily records of the Hang-Seng index for the 18-year period 1980-97. We find
estimates of α consistent with those describing the distribution of S&P
500 daily-returns. One possible reason for the scaling of these distributions
is the long persistence of the autocorrelation function of the volatility. For
time scales longer than (Δt)_× ≈ 4 days, our results are
consistent with slow convergence to Gaussian behavior. Comment: 12 pages in
multicol LaTeX format with 27 postscript figures (Submitted to PRE May 20,
1999). See http://polymer.bu.edu/~amaral/Professional.html for more of our
work on this area.
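The power-law tail behavior described above can be illustrated with a small sketch. This is a hedged illustration, not the authors' method: synthetic Student-t returns (whose tail exponent is 3 by construction) stand in for the S&P data, and the standard Hill estimator recovers the tail exponent.

```python
import numpy as np

def hill_estimator(x, k):
    """Hill estimator of the tail exponent alpha from the k largest |x|."""
    a = np.sort(np.abs(x))[::-1]          # order statistics, largest first
    logs = np.log(a[:k]) - np.log(a[k])   # log-spacings relative to the (k+1)-th largest
    return 1.0 / np.mean(logs)

# Synthetic heavy-tailed "returns": Student-t with 3 dof has tail exponent 3
rng = np.random.default_rng(0)
returns = rng.standard_t(df=3, size=100_000)

alpha = hill_estimator(returns, k=1000)
print(f"estimated tail exponent: {alpha:.2f}")  # typically near 3
```

An estimate well above 2 places the tails outside the stable Lévy regime, which is the key point of the abstract.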
Scaling of the distribution of price fluctuations of individual companies
We present a phenomenological study of stock price fluctuations of individual
companies. We systematically analyze two different databases covering
securities from the three major US stock markets: (a) the New York Stock
Exchange, (b) the American Stock Exchange, and (c) the National Association of
Securities Dealers Automated Quotation stock market. Specifically, we consider
(i) the trades and quotes database, for which we analyze 40 million records for
1000 US companies for the 2-year period 1994--95, and (ii) the Center for
Research and Security Prices database, for which we analyze 35 million daily
records for approximately 16,000 companies in the 35-year period 1962--96. We
study the probability distribution of returns over varying time scales Δt, where Δt varies by a factor of ≈ 10^5---from 5 min up to
approximately 4 years. For time scales from 5 min up to approximately 16 days, we
find that the tails of the distributions can be well described by a power-law
decay, characterized by an exponent α ≈ 3---well outside the
stable Lévy regime 0 < α < 2. For time scales Δt ≳ 16 days, we observe results consistent with a slow
convergence to Gaussian behavior. We also analyze the role of cross
correlations between the returns of different companies and relate these
correlations to the distribution of returns for market indices. Comment: 10 pages, 2-column format with 11 eps figures. LaTeX file requiring
epsf, multicol, revtex. Submitted to PR
Universal and non-universal properties of cross-correlations in financial time series
We use methods of random matrix theory to analyze the cross-correlation
matrix C of price changes of the largest 1000 US stocks for the 2-year period
1994-95. We find that the statistics of most of the eigenvalues in the spectrum
of C agree with the predictions of random matrix theory, but there are
deviations for a few of the largest eigenvalues. We find that C has the
universal properties of the Gaussian orthogonal ensemble of random matrices.
Furthermore, we analyze the eigenvectors of C through their inverse
participation ratio and find eigenvectors with large inverse participation
ratios at both edges of the eigenvalue spectrum--a situation reminiscent of
results in localization theory. Comment: 14 pages, 3 figures, Revtex
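A minimal numerical sketch of this kind of analysis, on synthetic i.i.d. data rather than the stock returns studied above (all sizes here are illustrative assumptions): it builds a correlation matrix, compares its spectrum with the random-matrix (Marchenko-Pastur) bounds, and computes inverse participation ratios of the eigenvectors.

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 200, 1000                       # 200 synthetic "stocks", 1000 time steps
returns = rng.standard_normal((N, T))  # i.i.d. noise: the pure-RMT benchmark

# Normalize each series, then build the cross-correlation matrix C = G G^T / T
g = (returns - returns.mean(axis=1, keepdims=True)) / returns.std(axis=1, keepdims=True)
C = g @ g.T / T

eigvals, eigvecs = np.linalg.eigh(C)

# Marchenko-Pastur bounds lambda_± = (1 ± sqrt(N/T))^2;
# eigenvalues outside this band signal genuine correlations
lam_min = (1 - np.sqrt(N / T)) ** 2
lam_max = (1 + np.sqrt(N / T)) ** 2
print(f"empirical eigenvalue range: [{eigvals.min():.2f}, {eigvals.max():.2f}]")
print(f"RMT prediction:             [{lam_min:.2f}, {lam_max:.2f}]")

# Inverse participation ratio I_k = sum_i u_ik^4: about 3/N for extended
# (GOE-like) eigenvectors, much larger for localized ones
ipr = np.sum(eigvecs ** 4, axis=0)
print(f"mean IPR: {ipr.mean():.4f}  (GOE expectation 3/N = {3 / N:.4f})")
```

For real market data, the largest eigenvalue (the "market mode") would sit far above lam_max; for this pure-noise benchmark the whole spectrum stays near the RMT band.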
Effects of electromagnetic waves on the electrical properties of contacts between grains
A DC electrical current is injected through a chain of metallic beads. The
electrical resistance of each bead-bead contact is measured. At low current,
the distribution of these resistances is broad and log-normal. At high enough
current, the resistance distribution becomes sharp and Gaussian due to the
creation of microweldings between some beads. The action of nearby
electromagnetic waves (sparks) on the electrical conductivity of the chain is
also studied. The effect of the spark is to lower the resistance values of the more
resistive contacts, while the most conductive ones remain unaffected by the spark
production. The spark is able to induce through the chain a current large enough to
create microweldings between some beads. This explains why the electrical
resistance of a granular medium is so sensitive to the electromagnetic waves
produced in its vicinity. Comment: 4 pages, 5 figures
Dynamical model and nonextensive statistical mechanics of a market index on large time windows
The shape and tails of partial distribution functions (PDF) for a financial
signal, i.e. the S&P500 and the turbulent nature of the markets are linked
through a model encompassing Tsallis nonextensive statistics and leading to
evolution equations of the Langevin and Fokker-Planck type. A model originally
proposed to describe the intermittent behavior of turbulent flows describes the
behavior of normalized log-returns for such a financial market index, for small
and large time windows, both for small and large log-returns. These turbulent
market volatility (of normalized log-returns) distributions can be sufficiently
well fitted with a Tsallis distribution. The transition between the small time
scale model of nonextensive, intermittent process and the large scale Gaussian
extensive homogeneous fluctuation picture is found to be at a ≈ 200 day
time lag. The intermittency exponent (μ) in the framework of the
Kolmogorov log-normal model is found to be related to the scaling exponent of
the PDF moments, thereby giving weight to the model. The large value of μ
points to a large number of cascades in the turbulent process. The
first Kramers-Moyal coefficient in the Fokker-Planck equation is almost equal
to zero, indicating ``no restoring force''. A comparison is made between
normalized log-returns and mere price increments. Comment: 40 pages, 14 figures; accepted for publication in Phys Rev
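The claim of a near-zero first Kramers-Moyal coefficient ("no restoring force") can be illustrated on synthetic data. This is a hedged sketch, not the paper's procedure, and all parameters are assumptions: it estimates the drift by regressing increments on levels for an Ornstein-Uhlenbeck process (which has a restoring force) and for a plain random walk (which does not).

```python
import numpy as np

rng = np.random.default_rng(6)
dt, n = 0.01, 200_000

# Ornstein-Uhlenbeck: restoring force -gamma * x, with gamma = 1 (assumption)
x_ou = np.zeros(n)
for i in range(1, n):
    x_ou[i] = x_ou[i - 1] - 1.0 * x_ou[i - 1] * dt + np.sqrt(dt) * rng.standard_normal()

# Random walk: no restoring force at all
x_rw = np.cumsum(np.sqrt(dt) * rng.standard_normal(n))

def drift_slope(x, dt):
    """Slope of E[dx | x] vs x: the linear part of the first Kramers-Moyal coefficient."""
    dx = np.diff(x)
    return np.polyfit(x[:-1], dx, 1)[0] / dt

print(f"OU drift slope:          {drift_slope(x_ou, dt):+.2f}")   # ≈ -1 (restoring)
print(f"random-walk drift slope: {drift_slope(x_rw, dt):+.2f}")   # ≈ 0
```

A drift slope indistinguishable from zero, as for the random walk here, is what the abstract reports for the market index.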
Common Scaling Patterns in Intertrade Times of U. S. Stocks
We analyze the sequence of time intervals between consecutive stock trades of
thirty companies representing eight sectors of the U. S. economy over a period
of four years. For all companies we find that: (i) the probability density
function of intertrade times may be fit by a Weibull distribution; (ii) when
appropriately rescaled the probability densities of all companies collapse onto
a single curve implying a universal functional form; (iii) the intertrade times
exhibit power-law correlated behavior within a trading day and a consistently
greater degree of correlation over larger time scales, in agreement with the
correlation behavior of the absolute price returns for the corresponding
company, and (iv) the magnitude series of intertrade time increments is
characterized by long-range power-law correlations suggesting the presence of
nonlinear features in the trading dynamics, while the sign series is
anti-correlated at small scales. Our results suggest that independent of
industry sector, market capitalization and average level of trading activity,
the series of intertrade times exhibit possibly universal scaling patterns,
which may relate to a common mechanism underlying the trading dynamics of
diverse companies. Further, our observation of long-range power-law
correlations and a parallel with the crossover in the scaling of absolute price
returns for each individual stock, support the hypothesis that the dynamics of
transaction times may play a role in the process of price formation. Comment: 8 pages, 5 figures. Presented at The Second Nikkei Econophysics
Workshop, Tokyo, 11-14 Nov. 2002. A subset appears in "The Application of
Econophysics: Proceedings of the Second Nikkei Econophysics Symposium",
editor H. Takayasu (Springer-Verlag, Tokyo, 2003) pp.51-57. Submitted to
Phys. Rev. E on 25 June 200
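The Weibull form in (i) and the rescaling collapse in (ii) are easy to illustrate on synthetic data. A hedged sketch, with a shape parameter that is an arbitrary assumption rather than a value from the study: it draws Weibull "intertrade times" by inverse-CDF sampling and recovers the shape exponent from a Weibull probability plot, where ln(-ln(1-F)) is linear in ln t with slope equal to the shape.

```python
import numpy as np

rng = np.random.default_rng(2)
c_true, scale = 0.7, 1.0        # shape < 1: broad distribution (illustrative assumption)
u = rng.uniform(size=50_000)
times = scale * (-np.log(u)) ** (1 / c_true)   # inverse-CDF Weibull sampling

# Weibull probability plot: ln(-ln(1 - F)) = c * ln t - c * ln(scale)
t_sorted = np.sort(times)
F = (np.arange(1, len(t_sorted) + 1) - 0.5) / len(t_sorted)  # empirical CDF
x = np.log(t_sorted)
y = np.log(-np.log(1 - F))
slope, intercept = np.polyfit(x, y, 1)
print(f"estimated shape c = {slope:.3f} (true {c_true})")
```

Rescaling each company's times by its mean would, under the universality reported above, collapse all such probability plots onto a single line.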
Effect of nonstationarities on detrended fluctuation analysis
Detrended fluctuation analysis (DFA) is a scaling analysis method used to
quantify long-range power-law correlations in signals. Many physical and
biological signals are ``noisy'', heterogeneous and exhibit different types of
nonstationarities, which can affect the correlation properties of these
signals. We systematically study the effects of three types of
nonstationarities often encountered in real data. Specifically, we consider
nonstationary sequences formed in three ways: (i) stitching together segments
of data obtained from discontinuous experimental recordings, or removing some
noisy and unreliable parts from continuous recordings and stitching together
the remaining parts -- a ``cutting'' procedure commonly used in preparing data
prior to signal analysis; (ii) adding to a signal with known correlations a
tunable concentration of random outliers or spikes with different amplitude,
and (iii) generating a signal comprised of segments with different properties
-- e.g. different standard deviations or different correlation exponents. We
compare the difference between the scaling results obtained for stationary
correlated signals and correlated signals with these three types of
nonstationarities. Comment: 17 pages, 10 figures, corrected some typos, added one reference
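A minimal sketch of the DFA procedure described above (non-overlapping windows, first-order polynomial detrending; a bare-bones illustration, not the authors' implementation):

```python
import numpy as np

def dfa(x, scales, order=1):
    """Detrended fluctuation analysis: returns F(n) for each window size n."""
    y = np.cumsum(x - np.mean(x))            # integrated profile
    F = []
    for n in scales:
        m = len(y) // n                      # number of non-overlapping windows
        segs = y[: m * n].reshape(m, n)
        t = np.arange(n)
        f2 = []
        for seg in segs:
            coef = np.polyfit(t, seg, order)               # local polynomial trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(f2)))
    return np.array(F)

# Uncorrelated (white) noise should give scaling exponent alpha ≈ 0.5
rng = np.random.default_rng(3)
x = rng.standard_normal(2**14)
scales = np.array([16, 32, 64, 128, 256])
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(f"DFA exponent: {alpha:.2f}")   # ≈ 0.5 for uncorrelated noise
```

The nonstationarities studied in the abstract (cutting, spikes, mixed segments) would be applied to `x` before calling `dfa`, and their effect read off from changes in the scaling of F(n).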
Effect of Trends on Detrended Fluctuation Analysis
Detrended fluctuation analysis (DFA) is a scaling analysis method used to
estimate long-range power-law correlation exponents in noisy signals. Many
noisy signals in real systems display trends, so that the scaling results
obtained from the DFA method become difficult to analyze. We systematically
study the effects of three types of trends -- linear, periodic, and power-law
trends, and offer examples where these trends are likely to occur in real data.
We compare the difference between the scaling results for artificially
generated correlated noise and correlated noise with a trend, and study how
trends lead to the appearance of crossovers in the scaling behavior. We find
that crossovers result from the competition between the scaling of the noise
and the ``apparent'' scaling of the trend. We study how the characteristics of
these crossovers depend on (i) the slope of the linear trend; (ii) the
amplitude and period of the periodic trend; (iii) the amplitude and power of
the power-law trend and (iv) the length as well as the correlation properties
of the noise. Surprisingly, we find that the crossovers in the scaling of noisy
signals with trends also follow scaling laws -- i.e. long-range power-law
dependence of the position of the crossover on the parameters of the trends. We
show that the DFA result of noise with a trend can be exactly determined by the
superposition of the separate results of the DFA on the noise and on the trend,
assuming that the noise and the trend are not correlated. If this superposition
rule is not followed, this is an indication that the noise and the superimposed
trend are not independent, so that removing the trend could lead to changes in
the correlation properties of the noise. Comment: 20 pages, 16 figures
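The superposition rule stated above can be checked numerically. A hedged sketch with assumed parameters (white noise plus a linear trend of arbitrary slope): it verifies that F²(n) of the sum is close to the sum of the separate F²(n) values of the noise and the trend.

```python
import numpy as np

def dfa1(x, n):
    """DFA fluctuation F(n) at a single window size n (first-order detrending)."""
    y = np.cumsum(x - np.mean(x))
    m = len(y) // n
    segs = y[: m * n].reshape(m, n)
    t = np.arange(n)
    f2 = [np.mean((s - np.polyval(np.polyfit(t, s, 1), t)) ** 2) for s in segs]
    return np.sqrt(np.mean(f2))

rng = np.random.default_rng(4)
N = 2**13
noise = rng.standard_normal(N)
trend = 0.01 * np.arange(N)        # linear trend; slope chosen for illustration

n = 128
F_noise = dfa1(noise, n)
F_trend = dfa1(trend, n)
F_sum = dfa1(noise + trend, n)
print(F_sum**2, F_noise**2 + F_trend**2)  # approximately equal
```

A large mismatch between the two printed values would, per the abstract, indicate that the noise and the trend are not independent.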
Robust Estimators in Generalized Pareto Models
This paper deals with optimally-robust parameter estimation in generalized
Pareto distributions (GPDs). These arise naturally in many situations where one
is interested in the behavior of extreme events as motivated by the
Pickands-Balkema-de Haan extreme value theorem (PBHT). The application we have
in mind is calculation of the regulatory capital required by Basel II for a
bank to cover operational risk. In this context the tail behavior of the
underlying distribution is crucial. This is where extreme value theory enters,
suggesting to estimate these high quantiles parametrically using, e.g., GPDs.
Robust statistics in this context offers procedures that bound the influence of
single observations, and so provides reliable inference in the presence of moderate
deviations from the distributional model assumptions, or from the
mechanisms underlying the PBHT. Comment: 26 pages, 6 figures
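As a point of comparison for the robust procedures discussed above, here is a sketch of classical (non-robust) GPD fitting by probability-weighted moments; the parameter values are illustrative assumptions, and this is not the optimally-robust estimator of the paper.

```python
import numpy as np

def gpd_pwm(x):
    """Probability-weighted-moment estimates (xi, beta) for a GPD sample (xi < 1/2)."""
    xs = np.sort(x)
    n = len(xs)
    a0 = xs.mean()                                           # E[X] = beta / (1 - xi)
    a1 = np.mean(xs * (n - np.arange(1, n + 1)) / (n - 1))   # E[X (1 - F(X))]
    r = a0 / a1                                              # = 2 (2 - xi) / (1 - xi)
    xi = (r - 4) / (r - 2)
    beta = a0 * (1 - xi)
    return xi, beta

# Sample a GPD by inverting its CDF: X = beta * ((1 - U)^(-xi) - 1) / xi
rng = np.random.default_rng(5)
xi_true, beta_true = 0.3, 1.0        # illustrative heavy-tail parameters (assumption)
u = rng.uniform(size=20_000)
x = beta_true * ((1 - u) ** (-xi_true) - 1) / xi_true

xi_hat, beta_hat = gpd_pwm(x)
print(f"xi   = {xi_hat:.3f} (true {xi_true})")
print(f"beta = {beta_hat:.3f} (true {beta_true})")
```

A single gross outlier appended to `x` can noticeably shift these classical estimates, which is precisely the sensitivity the paper's robust estimators are designed to bound.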
Spatial interactions in agent-based modeling
Agent Based Modeling (ABM) has become a widespread approach to model complex
interactions. In this chapter, after briefly summarizing some features of ABM,
the different approaches to modeling spatial interactions are discussed.
It is stressed that agents can interact either indirectly through a shared
environment and/or directly with each other. In such an approach, higher-order
variables such as commodity prices, population dynamics or even institutions,
are not exogenously specified but instead are seen as the results of
interactions. It is highlighted in the chapter that the understanding of
patterns emerging from such spatial interaction between agents is a key problem
as much as their description through analytical or simulation means.
The chapter reviews different approaches for modeling agents' behavior,
taking into account either explicit spatial (lattice based) structures or
networks. Some emphasis is placed on recent ABM as applied to the description
of the dynamics of the geographical distribution of economic activities, out
of equilibrium. The Eurace@Unibi Model, an agent-based macroeconomic model with
spatial structure, is used to illustrate the potential of such an approach for
spatial policy analysis. Comment: 26 pages, 5 figures, 105 references; a chapter prepared for the book
"Complexity and Geographical Economics - Topics and Tools", P. Commendatore,
S.S. Kayam and I. Kubin, Eds. (Springer, in press, 2014).
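A toy illustration of indirect spatial interaction through a shared environment, in the spirit described above; all rules and parameters here are invented for illustration and this is not the Eurace@Unibi model. Agents on a periodic lattice greedily move toward resource-rich cells and deplete them, so they interact only through the environment they jointly modify.

```python
import numpy as np

rng = np.random.default_rng(7)
L, n_agents, steps = 20, 50, 100

resource = rng.uniform(0.5, 1.0, size=(L, L))   # shared environment
agents = rng.integers(0, L, size=(n_agents, 2)) # (row, col) positions

for _ in range(steps):
    for k in range(n_agents):
        r, c = agents[k]
        # Candidate moves: stay or step to one of 4 neighbors (periodic boundary)
        moves = [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]
        scores = [resource[(r + dr) % L, (c + dc) % L] for dr, dc in moves]
        dr, dc = moves[int(np.argmax(scores))]   # greedy: richest nearby cell
        r, c = (r + dr) % L, (c + dc) % L
        agents[k] = r, c
        resource[r, c] *= 0.5                    # harvest: deplete the cell
    resource = np.minimum(resource + 0.01, 1.0)  # slow regrowth

# Emergent pattern: depleted patches mark where agents have clustered
print(f"mean resource after {steps} steps: {resource.mean():.2f}")
```

Even this minimal setup exhibits the chapter's central point: spatial patterns (here, depleted patches) emerge from local decisions without being exogenously specified.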