An empirical behavioral model of liquidity and volatility
We develop a behavioral model for liquidity and volatility based on empirical
regularities in trading order flow in the London Stock Exchange. This can be
viewed as a very simple agent-based model in which all components of the model
are validated against real data. Our empirical studies of order flow uncover
several interesting regularities in the way trading orders are placed and
cancelled. The resulting simple model of order flow is used to simulate price
formation under a continuous double auction, and the statistical properties of
the resulting simulated sequence of prices are compared to those of real data.
The model is constructed using one stock (AZN) and tested on 24 other stocks.
For low-volatility, small-tick-size stocks (called Group I) the predictions are very good, but for stocks outside Group I they are poor. For Group I, the
model predicts the correct magnitude and functional form of the distribution of
the volatility and the bid-ask spread, without adjusting any parameters based
on prices. This suggests that at least for Group I stocks, the volatility and
heavy tails of prices are related to market microstructure effects, and
supports the hypothesis that, at least on short time scales, the large
fluctuations of absolute returns are well described by a power law with an
exponent that varies from stock to stock.
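For intuition, here is a minimal sketch of the kind of zero-intelligence order-flow simulation the abstract describes: random limit orders, cancellations, and market orders interacting through a continuous double auction. The event probabilities, the exponential placement offsets, and all other parameters are illustrative assumptions, not the calibrated values from the paper.

```python
# Minimal sketch of a zero-intelligence order-flow simulation under a
# continuous double auction. All parameters (rates, offsets, horizon)
# are illustrative assumptions, not the paper's calibrated values.
import random

random.seed(42)

N_STEPS = 50_000    # number of order-flow events (assumed)
P_LIMIT = 0.5       # prob. an event is a limit order (assumed)
P_CANCEL = 0.3      # prob. an event is a cancellation (assumed)

bids, asks = {}, {}          # price (in ticks) -> outstanding shares
mid = 1000                   # initial mid price in ticks (assumed)

def best(book, side):
    """Best quote of a book: highest bid or lowest ask."""
    if not book:
        return None
    return max(book) if side == "bid" else min(book)

prices = []
for _ in range(N_STEPS):
    u = random.random()
    side = random.choice(["bid", "ask"])
    book, other = (bids, asks) if side == "bid" else (asks, bids)
    if u < P_LIMIT:
        # place a limit order near the mid (exponential offset assumed)
        offset = int(random.expovariate(0.1)) + 1
        price = mid - offset if side == "bid" else mid + offset
        book[price] = book.get(price, 0) + 1
    elif u < P_LIMIT + P_CANCEL and book:
        # cancel one randomly chosen outstanding order
        price = random.choice(list(book))
        book[price] -= 1
        if book[price] == 0:
            del book[price]
    elif other:
        # market order: consume one share at the opposite best quote
        price = best(other, "ask" if side == "bid" else "bid")
        other[price] -= 1
        if other[price] == 0:
            del other[price]
    bb, ba = best(bids, "bid"), best(asks, "ask")
    if bb is not None and ba is not None:
        mid = (bb + ba) // 2
    prices.append(mid)

# Returns of the simulated mid-price series can then be compared with
# empirical distributions of volatility and of the bid-ask spread.
returns = [b - a for a, b in zip(prices, prices[1:])]
```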
Estimating the Algorithmic Complexity of Stock Markets
Randomness and regularities in Finance are usually treated in probabilistic
terms. In this paper, we develop a completely different approach using a
non-probabilistic framework based on the algorithmic information theory
initially developed by Kolmogorov (1965). We present some elements of this
theory and show why it is particularly relevant to Finance, and potentially to
other sub-fields of Economics as well. We develop a generic method to estimate
the Kolmogorov complexity of numeric series. This approach is based on an
iterative "regularity erasing procedure" implemented to use lossless
compression algorithms on financial data. Examples are provided with both
simulated and real-world financial time series. The contributions of this
article are twofold. The first is methodological: we show that some structural regularities, invisible to classical statistical tests, can be detected by this algorithmic method. The second consists of illustrations on the daily Dow-Jones Index suggesting that, beyond several well-known regularities, hidden structure in this index may remain to be identified.
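As a concrete illustration of the compression-based idea, the sketch below discretizes a numeric series to bytes and uses zlib as the lossless compressor; a compression ratio near 1 indicates incompressibility, i.e., high estimated complexity. The discretization depth and the choice of zlib are assumptions, and the paper's iterative regularity-erasing step (e.g., transforming prices into returns before compressing) is omitted for brevity.

```python
# Sketch of a compression-based proxy for the algorithmic (Kolmogorov)
# complexity of a numeric series. Discretization depth and the use of
# zlib are illustrative assumptions, not the paper's exact procedure.
import zlib
import math
import random

def discretize(series, levels=256):
    """Map a real-valued series onto `levels` symbols (one byte each)."""
    lo, hi = min(series), max(series)
    span = (hi - lo) or 1.0
    return bytes(int((x - lo) / span * (levels - 1)) for x in series)

def compression_ratio(series):
    """Compressed size over raw size; values near 1 'look random'."""
    raw = discretize(series)
    return len(zlib.compress(raw, 9)) / len(raw)

random.seed(0)
iid_noise = [random.gauss(0, 1) for _ in range(10_000)]
periodic = [math.sin(2 * math.pi * t / 64) for t in range(10_000)]

print(f"i.i.d. noise : {compression_ratio(iid_noise):.3f}")  # stays high
print(f"periodic     : {compression_ratio(periodic):.3f}")   # collapses
```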
Decoherent Histories Quantum Mechanics with One 'Real' Fine-Grained History
Decoherent histories quantum theory is reformulated with the assumption that
there is one "real" fine-grained history, specified in a preferred complete set
of sum-over-histories variables. This real history is described by embedding it
in an ensemble of comparable imagined fine-grained histories, not unlike the
familiar ensemble of statistical mechanics. These histories are assigned
extended probabilities, which can sometimes be negative or greater than one. As
we will show, this construction implies that the real history is not completely
accessible to experimental or other observational discovery. However,
sufficiently and appropriately coarse-grained sets of alternative histories
have standard probabilities providing information about the real fine-grained
history that can be compared with observation. We recover the probabilities of
decoherent histories quantum mechanics for sets of histories that are recorded
and therefore decohere. Quantum mechanics can be viewed as a classical
stochastic theory of histories with extended probabilities and a well-defined
notion of reality common to all decoherent sets of alternative coarse-grained
histories.
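Schematically, the relation between extended and standard probabilities can be written as follows; the notation below is ours, not taken from the paper.

```latex
% Fine-grained histories h carry extended probabilities w(h), which may
% be negative or exceed one, while a coarse-grained alternative c
% bundles many fine-grained histories:
\[
  p(c) \;=\; \sum_{h \in c} w(h), \qquad \sum_{h} w(h) = 1 ,
\]
% For a sufficiently coarse-grained, decoherent (recorded) set of
% alternatives \{c\}, each p(c) behaves as an ordinary probability:
\[
  0 \le p(c) \le 1, \qquad \sum_{c} p(c) = 1 .
\]
```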
A variational approach to niche construction
In evolutionary biology, niche construction is sometimes described as a genuine evolutionary process whereby organisms, through their activities and regulatory mechanisms, modify their environment so as to steer their own evolutionary trajectory, and that of other species. There is ongoing debate, however, over the extent to which niche construction ought to be considered a bona fide evolutionary force, on a par with natural selection. Recent formulations of the variational free-energy principle as applied to the life sciences describe the properties of living systems, and their selection in evolution, in terms of variational inference. We argue that niche construction can be described using a variational approach. We propose new arguments supporting the niche construction perspective, and extend the variational approach to niche construction toward current perspectives in various scientific fields.
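To make the variational vocabulary concrete, here is a minimal sketch of the variational free energy F = E_q[ln q(s) - ln p(o, s)] for a toy discrete generative model; the two-state world and all numbers are illustrative assumptions of ours, not the paper's model.

```python
# Minimal sketch of variational free energy for a discrete generative
# model: the quantity the free-energy principle says adaptive systems
# minimize. The toy world and all numbers are illustrative assumptions.
import math

# Generative model: prior over hidden states s and likelihood p(o | s).
prior = {"wet": 0.3, "dry": 0.7}
likelihood = {"wet": {"rain": 0.8, "sun": 0.2},
              "dry": {"rain": 0.1, "sun": 0.9}}

def free_energy(q, observation):
    """F = E_q[ln q(s) - ln p(o, s)] >= -ln p(o) (surprise)."""
    return sum(q[s] * (math.log(q[s])
                       - math.log(prior[s] * likelihood[s][observation]))
               for s in q if q[s] > 0)

obs = "rain"
# The exact posterior minimizes F, making it equal to the surprise.
evidence = sum(prior[s] * likelihood[s][obs] for s in prior)
posterior = {s: prior[s] * likelihood[s][obs] / evidence for s in prior}

print(f"F at a flat belief : {free_energy({'wet': 0.5, 'dry': 0.5}, obs):.4f}")
print(f"F at the posterior : {free_energy(posterior, obs):.4f}")
print(f"surprise -ln p(o)  : {-math.log(evidence):.4f}")
```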
Visual motion processing and human tracking behavior
The accurate visual tracking of a moving object is a fundamental human skill that reduces the relative slip and instability of the object's image on the retina, thus granting stable, high-quality vision. In order to
optimize tracking performance across time, a quick estimate of the object's
global motion properties needs to be fed to the oculomotor system and
dynamically updated. Concurrently, performance can be greatly improved in terms
of latency and accuracy by taking into account predictive cues, especially
under variable conditions of visibility and in the presence of ambiguous retinal
information. Here, we review several recent studies focusing on the integration
of retinal and extra-retinal information for the control of human smooth
pursuit. By dynamically probing tracking performance with well-established paradigms from the visual perception and oculomotor literature, we provide a basis for testing theoretical hypotheses within the framework of dynamic probabilistic inference. In particular, we present applications of these results in light of state-of-the-art computer vision algorithms.
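As one concrete instance of dynamic probabilistic inference for tracking, the sketch below runs a 1-D constant-velocity Kalman filter that fuses noisy position measurements with an internal motion prediction, loosely analogous to combining retinal and extra-retinal signals. The dynamics, the noise levels, and the Kalman-filter choice itself are illustrative assumptions of ours, not a model taken from the reviewed studies.

```python
# Sketch: 1-D constant-velocity Kalman filter as a toy model of
# dynamic probabilistic inference for target tracking. All noise
# parameters are illustrative assumptions, not fitted values.
import numpy as np

dt = 0.01                                  # time step (s), assumed
F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity dynamics
H = np.array([[1.0, 0.0]])                 # only position is measured
Q = 1e-4 * np.eye(2)                       # process noise (assumed)
R = np.array([[1e-2]])                     # measurement noise (assumed)

x = np.array([[0.0], [0.0]])               # state: [position, velocity]
P = np.eye(2)                              # state covariance

rng = np.random.default_rng(1)
true_pos, true_vel = 0.0, 2.0              # target moves at 2 deg/s
estimates = []
for _ in range(500):
    true_pos += true_vel * dt
    z = np.array([[true_pos + rng.normal(0, 0.1)]])  # noisy retinal input

    # Predict: extra-retinal (internal model) extrapolation.
    x = F @ x
    P = F @ P @ F.T + Q

    # Update: weigh prediction against retinal evidence via Kalman gain.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
    estimates.append(float(x[0, 0]))

print(f"final velocity estimate: {float(x[1, 0]):.2f} deg/s (true 2.00)")
```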