"Thermometers" of Speculative Frenzy
Establishing unambiguously the existence of speculative bubbles is an
ongoing controversy, complicated by the need to define a model of fundamental
prices. Here, we present a novel empirical method which bypasses all the
difficulties of the previous approaches by monitoring external indicators of
anomalously growing public interest at times of bubbles. From the
definition of a bubble as a self-fulfilling reinforcing price change, we
identify indicators of a possible self-reinforcing imitation between agents in
the market. We show that during the build-up phase of a bubble, there is growing
public interest in the commodity in question, whether it consists of stocks,
diamonds or coins. That interest can be estimated through different indicators:
an increase in the number of books published on the topic and an increase in
subscriptions to specialized journals. Moreover, the well-known
empirical rule according to which the volume of sales is growing during a bull
market finds a natural interpretation in this framework: sales increases in
fact reveal and pinpoint the progress of the bubble's diffusion throughout
society. We also present a simple model of rational expectation which maps
exactly onto the Ising model on a random graph. The indicators are then
interpreted as ``thermometers'', measuring the balance between idiosyncratic
information (noise temperature) and imitation (coupling) strength. In this
context, bubbles are interpreted as low or critical temperature phases, where
the imitation strength carries market prices up essentially independently of
fundamentals. Contrary to the naive conception of bubbles and crashes as times
of disorder, we show that they are times when the consensus is too
strong. Comment: 15 pages + 10 figures
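The mapping onto an Ising model on a random graph can be illustrated with a toy simulation (a hedged sketch; the graph size, coupling and temperatures below are illustrative assumptions, not the authors' calibration). Imitation is the ferromagnetic coupling J, idiosyncratic information is the noise temperature T, and the absolute magnetization plays the role of a consensus "thermometer":

```python
import math
import random

def ising_consensus(n=200, p=0.05, J=1.0, T=1.0, sweeps=400, seed=0):
    rng = random.Random(seed)
    nbrs = [[] for _ in range(n)]           # Erdos-Renyi random graph
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                nbrs[i].append(j)
                nbrs[j].append(i)
    s = [rng.choice((-1, 1)) for _ in range(n)]
    for _ in range(sweeps * n):             # Glauber (heat-bath) dynamics
        i = rng.randrange(n)
        h = J * sum(s[j] for j in nbrs[i])  # local "imitation" field
        if rng.random() < 1.0 / (1.0 + math.exp(-2.0 * h / T)):
            s[i] = 1
        else:
            s[i] = -1
    return abs(sum(s)) / n                  # |magnetization| = consensus level

m_cold = ising_consensus(T=2.0)    # strong imitation relative to noise
m_hot = ising_consensus(T=50.0)    # noise dominates
```

In the low-temperature (strong-imitation) run the spins lock into near-unanimity, while in the high-temperature run the consensus indicator stays near zero, which is the balance the "thermometers" are meant to measure.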
The sharp peak-flat trough pattern and critical speculation
We find empirically a characteristic sharp peak-flat trough pattern in a
large set of commodity prices. We argue that the sharp peak structure reflects
an endogenous inter-market organization, and that peaks may be seen as local
``singularities'' resulting from imitation and herding. These findings impose a
novel stringent constraint on the construction of models. Intermittent
amplification is not sufficient and nonlinear effects seem necessary to account
for the observations. Comment: 20 pages, 6 figures (only figs. 4 and 6
available in ps format), 3 tables, European Physical Journal B (in press)
Celebrating the Physics in Geophysics
As 2005, the International Year of Physics, comes to an end, two physicists
working primarily in geophysical research reflect on how geophysics is not
merely an applied branch of physics. Although geophysics has certainly
benefited from progress in
physics and sometimes emulated the reductionist program of mainstream physics,
it has also educated the physics community about some of the generic behaviors
of strongly nonlinear systems. Dramatic examples are the insights we have
gained into the ``emergent'' phenomena of chaos, cascading instabilities,
turbulence, self-organization, fractal structure, power-law variability,
anomalous scaling, threshold dynamics, creep, fracture, and so on. In all of
these examples, relatively simple models have been able to explain the
recurring features of apparently very complex signals and fields. It appears
that the future of the intricate relation between physics and geophysics will
be as exciting as its past, which has been characterized by a mutual
fascination.
Physics departments in our universities should capitalize on this trend to
attract and retain young talent motivated to address problems that really
matter for the future of the planet. One pressing topic, with a huge impact on
populations and challenging enough to bring the physics and geophysics
communities together like never before, is the understanding and prediction of
extreme events. Comment: 6 pages, final version to appear in EOS-AGU
Transactions in November 2005
Self-Similar Log-Periodic Structures in Western Stock Markets from 2000
The presence of log-periodic structures before and after stock market crashes
is considered to be an imprint of an intrinsic discrete scale invariance (DSI)
in this complex system. The fractal framework of the theory leaves open the
possibility of observing self-similar log-periodic structures at different time
scales. In the present work we analyze the daily closures of three of the most
important indices worldwide since 2000: the DAX for Germany and the Nasdaq100
and the S&P500 for the United States. The qualitative behaviour of these
different markets is similar during the temporal frame studied. Evidence is
found for decelerating log-periodic oscillations of duration about two years
and starting in September 2000. Moreover, a nested sub-structure starting in
May 2002 is revealed, bringing more evidence to support the hypothesis of
self-similar, log-periodic behavior. Ongoing log-periodic oscillations are also
revealed. A Lomb analysis over the aforementioned periods indicates a
preferential scaling factor; higher-order harmonics are also
present. The spectral pattern of the data has been found to be similar to that
of a Weierstrass-type function, used as a prototype of a log-periodic fractal
function. Comment: 17 pages, 14 figures. International Journal of Modern
Physics C, in press
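The discrete-scale-invariance idea, oscillations periodic in the logarithm of the distance to a critical time tc, can be made concrete in a few lines (illustrative numbers only; omega, tc and the anchor distance are assumptions, not values fitted to the DAX, Nasdaq100 or S&P500):

```python
import math

omega = 6.0                              # log-frequency (illustrative)
tc = 100.0                               # critical time (illustrative)
lam = math.exp(2 * math.pi / omega)      # preferred scaling factor

def logper(t):
    # pure log-periodic oscillation around the critical time tc
    return math.cos(omega * math.log(tc - t))

# times at which the oscillation returns to the same phase: their distances
# to tc shrink by the constant *factor* lam, not by a constant period
t_k = [tc - 50.0 * math.exp(-2 * math.pi * k / omega) for k in range(4)]
ratios = [(tc - t_k[k]) / (tc - t_k[k + 1]) for k in range(3)]
```

Each ratio equals lam, and logper takes the same value at every t_k: this geometric spacing is the signature that a Lomb periodogram of the signal as a function of ln(tc - t) would pick up as a preferential scaling factor.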
Predicted and Verified Deviations from Zipf's law in Ecology of Competing Products
Zipf's power-law distribution is a generic empirical statistical regularity
found in many complex systems. However, rather than universality with a single
power-law exponent (equal to 1 for Zipf's law), there are many reported
deviations that remain unexplained. A recently developed theory finds that the
interplay between (i) one of the most universal ingredients, namely stochastic
proportional growth, and (ii) birth and death processes, leads to a generic
power-law distribution with an exponent that depends on the characteristics of
each ingredient. Here, we report the first complete empirical test of the
theory and its application, based on an analysis of the dynamics of market
shares in the product market. We estimate directly the average growth
rate of market shares and its standard deviation, the birth rates and the
"death" (hazard) rate of products. We find that temporal variations and product
differences of the observed power-law exponents can be fully captured by the
theory with no adjustable parameters. Our results can be generalized to many
systems for which the statistical properties revealed by power law exponents
are directly linked to the underlying generating mechanism.
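The mechanism the theory combines, Gibrat-style proportional growth plus entry and exit, can be sketched as a toy simulation (hypothetical rates chosen for illustration, not the authors' estimates from market-share data):

```python
import math
import random

def simulate_sizes(steps=300, births_per_step=30, mu=0.0, sigma=0.15,
                   hazard=0.02, seed=1):
    rng = random.Random(seed)
    sizes = []
    for _ in range(steps):
        # survivors grow multiplicatively (Gibrat); others exit at rate `hazard`
        sizes = [s * math.exp(rng.gauss(mu, sigma))
                 for s in sizes if rng.random() > hazard]
        sizes.extend([1.0] * births_per_step)   # new products enter at unit size
    return sizes

sizes = simulate_sizes()
tail = sorted(s for s in sizes if s > 1.0)
# Hill estimate of the tail exponent m in P(S > s) ~ s^(-m) above s = 1
m_hat = len(tail) / sum(math.log(s) for s in tail)
```

The stationary size distribution develops a power-law tail whose exponent depends on the drift, volatility and hazard rate together, so Zipf's exponent 1 appears only for a special balance of the ingredients, which is the sense in which deviations from Zipf's law are predicted.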
Response Functions to Critical Shocks in Social Sciences: An Empirical and Numerical Study
We show that, provided one focuses on properly selected episodes, one can
apply to the social sciences the same observational strategy that has proved
successful in natural sciences such as astrophysics or geodynamics. For
instance, in order to probe the cohesion of a policy, one can, in different
countries, study the reactions to some huge and sudden exogenous shocks, which
we call Dirac shocks. This approach naturally leads to the notion of structural
(as opposed or complementary to temporal) forecast. Although structural
predictions are by far the most common way to test theories in the natural
sciences, they have been much less used in the social sciences. The Dirac shock
approach opens the way to testing structural predictions in the social
sciences. The examples reported here suggest that critical events are able to
reveal pre-existing ``cracks'' because they probe the social cohesion which is
an indicator and predictor of future evolution of the system, and in some cases
foreshadows a bifurcation. We complement our empirical work with numerical
simulations of the response function (``damage spreading'') to Dirac shocks in
the Sznajd model of consensus build-up. We quantify the slow relaxation of the
difference between perturbed and unperturbed systems, the conditions under
which the consensus is modified by the shock and the large variability from one
realization to another.
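Damage spreading to a Dirac shock in the Sznajd model can be sketched with a minimal one-dimensional toy (chain length, shock size and run length are illustrative assumptions, not the paper's simulation parameters): two copies evolve under the same random updates, one copy receives the shock, and the Hamming distance between them tracks the damage.

```python
import random

def sznajd_update(s, i):
    # an agreeing pair (i, i+1) convinces its outer neighbours
    if s[i] == s[i + 1]:
        if i - 1 >= 0:
            s[i - 1] = s[i]
        if i + 2 < len(s):
            s[i + 2] = s[i]

def damage_after_shock(n=100, steps=5000, shock=10, seed=3):
    rng = random.Random(seed)
    a = [rng.choice((-1, 1)) for _ in range(n)]
    b = list(a)
    for j in range(shock):          # "Dirac shock": flip a block of spins in copy b
        b[j] = -b[j]
    damage = []
    for _ in range(steps):
        i = rng.randrange(n - 1)    # the same random pair drives both copies
        sznajd_update(a, i)
        sznajd_update(b, i)
        damage.append(sum(x != y for x, y in zip(a, b)))
    return damage

d = damage_after_shock()
```

The time series d is the response function: averaging it over realizations would quantify the slow relaxation and the run-to-run variability described above.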
Analysis of the phenomenon of speculative trading in one of its basic manifestations: postage stamp bubbles
We document and analyze the empirical facts concerning some of the clearest
evidence of speculation in financial trading, as observed in the postage stamp
collection market. We unravel some of the mechanisms of speculative
behavior which emphasize the role of fancy and collective behavior. In our
conclusion, we propose a classification of speculative markets based on two
parameters, namely the amplitude of the price peak and a second parameter that
measures its ``sharpness''. This study is offered to anchor modeling efforts to
realistic market constraints and observations. Comment: 9 pages, 5 figures and 2 tables, in press in Int. J. Mod. Phys.
Persistence and Quiescence of Seismicity on Fault Systems
We study the statistics of simulated earthquakes in a quasistatic model of
two parallel heterogeneous faults within a slowly driven elastic tectonic
plate. The probability that one fault remains dormant while the other is
active for a time Dt following the previous activity shift is proportional to
Dt^{-(1+x)}, a result that is robust in the presence of
annealed noise and strength weakening. A mean field theory accounts for the
observed dependence of the persistence exponent x as a function of
heterogeneity and distance between faults. These results continue to hold if
the number of competing faults is increased. This is related to the persistence
phenomenon discovered in a large variety of systems, which specifies how long a
relaxing dynamical system remains in a neighborhood of its initial
configuration. Our persistence exponent is found to vary as a function of
heterogeneity and distance between faults, thus defining a novel universality
class. Comment: 4 pages, 3 figures, Revtex
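The generic persistence phenomenon invoked here can be illustrated with a textbook toy rather than the fault model itself (a hedged sketch with illustrative sample sizes): the probability that a simple random walk has not yet returned to the origin decays as a power law of time, with persistence exponent 1/2.

```python
import random

def survival_fraction(t, walkers=10000, seed=5):
    # fraction of random walks that have not returned to the origin by time t
    rng = random.Random(seed)
    alive = 0
    for _ in range(walkers):
        x = rng.choice((-1, 1))
        ok = True
        for _ in range(t - 1):
            x += rng.choice((-1, 1))
            if x == 0:
                ok = False
                break
        alive += ok
    return alive / walkers

p1 = survival_fraction(50)
p2 = survival_fraction(200)
# P(no return up to t) ~ t^(-1/2): quadrupling t roughly halves the fraction
```

The fault model's exponent x plays the same role as this 1/2, except that it varies with heterogeneity and inter-fault distance, which is what defines the novel universality class.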
Illusory versus genuine control in agent-based games
In the Minority, Majority and Dollar Games (MG, MAJG, $G), agents attempt to predict and benefit both from trends and changes in the direction of a market. It has been previously shown that in the MG, for a reasonable number of preliminary time steps preceding equilibrium (Time Horizon MG, THMG), agents' attempts to optimize their gains by active strategy selection are "illusory": the hypothetical gains of their strategies are greater on average than the agents' actual average gains. Furthermore, if a small proportion of agents deliberately choose and act in accord with their seemingly worst-performing strategy, these outperform all other agents on average, and even attain a mean positive gain, otherwise rare for agents in the MG. This latter phenomenon raises the question of how well the optimization procedure works in the THMAJG and THG. This provides further clarification of the kinds of situations subject to genuine control, and those not, in set-ups a priori defined to emphasize the importance of optimization.
"Illusion of control" in Time-Horizon Minority and ParrondoGames
Human beings like to believe they are in control of their destiny. This ubiquitous trait seems to increase motivation and persistence, and is probably evolutionarily adaptive [S.E. Taylor, J.D. Brown, Psych. Bull. 103, 193 (1988); A. Bandura, Self-efficacy: the exercise of control (WH Freeman, New York, 1997)]. But how good really is our ability to control? How successful is our track record in these areas? There is little understanding of when and under what circumstances we may over-estimate [E. Langer, J. Pers. Soc. Psych. 7, 185 (1975)] or even lose our ability to control and optimize outcomes, especially when they are the result of aggregations of individual optimization processes. Here, we demonstrate analytically, using the theory of Markov chains, and by numerical simulations in two classes of games, the Time-Horizon Minority Game [M.L. Hart, P. Jefferies, N.F. Johnson, Phys. A 311, 275 (2002)] and the Parrondo Game [J.M.R. Parrondo, G.P. Harmer, D. Abbott, Phys. Rev. Lett. 85, 5226 (2000); J.M.R. Parrondo, How to cheat a bad mathematician (ISI, Italy, 1996)], that agents who optimize their strategy based on past information may actually perform worse than non-optimizing agents. In other words, low-entropy (more informative) strategies under-perform high-entropy (or random) strategies. This provides a precise definition of the "illusion of control" in certain set-ups a priori defined to emphasize the importance of optimization.
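The gap behind the "illusion of control" can be made measurable with a minimal standard Minority Game sketch (illustrative parameters and a deterministic tie-break; not the time-horizon variants studied in these papers): each agent's realized gain is compared with the virtual score of its best strategy.

```python
import random

def minority_game(n_agents=101, m=3, rounds=2000, seed=7):
    rng = random.Random(seed)
    n_hist = 2 ** m
    # each strategy is a lookup table: history index -> action in {-1, +1}
    strats = [[[rng.choice((-1, 1)) for _ in range(n_hist)]
               for _ in range(2)] for _ in range(n_agents)]
    scores = [[0, 0] for _ in range(n_agents)]  # virtual (hypothetical) scores
    gains = [0] * n_agents                      # realized gains
    hist = 0
    for _ in range(rounds):
        plays = []
        for a in range(n_agents):
            k = 0 if scores[a][0] >= scores[a][1] else 1  # play best-scoring strategy
            plays.append((a, strats[a][k][hist]))
        total = sum(act for _, act in plays)
        win = -1 if total > 0 else 1            # the minority side wins
        for a, act in plays:
            gains[a] += 1 if act == win else -1
            for k in range(2):                  # both strategies scored virtually
                scores[a][k] += 1 if strats[a][k][hist] == win else -1
        hist = ((hist << 1) | (1 if win == 1 else 0)) % n_hist
    return gains, scores

gains, scores = minority_game()
mean_gain = sum(gains) / len(gains)
mean_best_virtual = sum(max(s) for s in scores) / len(scores)
```

Since the majority always loses, the mean realized gain is negative, while the hypothetical score of each agent's best strategy looks better on average: that gap is the operational content of the "illusion of control" discussed above.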