Estimating the Algorithmic Complexity of Stock Markets
Randomness and regularities in Finance are usually treated in probabilistic
terms. In this paper, we develop a completely different approach, using a
non-probabilistic framework based on the algorithmic information theory
initially developed by Kolmogorov (1965). We present some elements of this
theory and show why it is particularly relevant to Finance, and potentially to
other sub-fields of Economics as well. We develop a generic method to estimate
the Kolmogorov complexity of numeric series. This approach is based on an
iterative "regularity erasing procedure" implemented to use lossless
compression algorithms on financial data. Examples are provided with both
simulated and real-world financial time series. The contributions of this
article are twofold. The first is methodological: we show that some
structural regularities, invisible to classical statistical tests, can be
detected by this algorithmic method. The second consists of illustrations
on the daily Dow Jones Index suggesting that, beyond several well-known
regularities, hidden structure in this index may remain to be identified.
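
The method lends itself to a compact illustration. The Python sketch below is a minimal, assumed version of the idea, not the authors' exact procedure: it erases one obvious regularity by first-differencing and discretizing the series, then compares the lossless-compressed size of the result against shuffled copies. All function and parameter names here are invented for the example.

    import random
    import zlib

    def compressed_size(symbols):
        # Length in bytes of the zlib-compressed symbol string.
        return len(zlib.compress("".join(symbols).encode(), 9))

    def complexity_ratio(series, n_bins=8, n_shuffles=20):
        # Regularity erasing step (assumed here): first differences
        # remove trends; discretizing gives a compressible string.
        diffs = [b - a for a, b in zip(series, series[1:])]
        lo, hi = min(diffs), max(diffs)
        width = (hi - lo) / n_bins or 1.0
        symbols = [chr(48 + min(int((d - lo) / width), n_bins - 1))
                   for d in diffs]
        original = compressed_size(symbols)
        sizes = []
        for _ in range(n_shuffles):
            random.shuffle(symbols)
            sizes.append(compressed_size(symbols))
        # Shuffling preserves symbol frequencies, so a ratio well
        # below 1 indicates structure beyond the marginal distribution.
        return original / (sum(sizes) / len(sizes))

    # A noisy periodic series compresses far better than its shuffles.
    series = [i % 7 + 0.1 * random.random() for i in range(5000)]
    print(round(complexity_ratio(series), 3))

A ratio well below 1 flags structure that survives the erasing step, which is the kind of signal the paper reports looking for in financial series.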
Curriculum Guidelines for Undergraduate Programs in Data Science
The Park City Math Institute (PCMI) 2016 Summer Undergraduate Faculty Program
met for the purpose of composing guidelines for undergraduate programs in Data
Science. The group consisted of 25 undergraduate faculty from a variety of
institutions in the U.S., primarily from the disciplines of mathematics,
statistics and computer science. These guidelines are meant to provide some
structure for institutions planning for or revising a major in Data Science.
A Complete Theory of Everything (will be subjective)
Increasingly encompassing models have been suggested for our world. Theories
range from generally accepted to increasingly speculative to apparently bogus.
The progression of theories from ego- to geo- to helio-centric models to
universe and multiverse theories and beyond was accompanied by a dramatic
increase in the sizes of the postulated worlds, with humans being expelled from
their center to ever more remote and random locations. Rather than leading to a
true theory of everything, this trend faces a turning point after which the
predictive power of such theories decreases (actually to zero). Incorporating
the location and other capacities of the observer into such theories avoids
this problem and allows one to distinguish meaningful from predictively meaningless
theories. This also leads to a truly complete theory of everything consisting
of a (conventional objective) theory of everything plus a (novel subjective)
observer process. The observer localization is neither based on the
controversial anthropic principle, nor has it anything to do with the
quantum-mechanical observation process. The suggested principle is extended to
more practical (partial, approximate, probabilistic, parametric) world models
(rather than theories of everything). Finally, I provide a justification of
Ockham's razor, and criticize the anthropic principle, the doomsday argument,
the no free lunch theorem, and the falsifiability dogma.
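
The core of the argument can be made concrete with a toy two-part code: the length of a complete theory is the length of its objective part plus the bits needed to localize the observer within the world it describes. The Python sketch below uses made-up bit counts purely for illustration; nothing here reproduces the paper's own calculations.

    import math

    def total_description_length(theory_bits, n_locations):
        # Two-part code: bits to state the theory itself, plus
        # log2(n_locations) bits to single out the observer among
        # the candidate locations (assuming a uniform code).
        return theory_bits + math.log2(n_locations)

    # Toy numbers (assumptions): a sweeping theory is short to state
    # but leaves the observer somewhere in an enormous space, while a
    # narrower theory is longer but localizes the observer cheaply.
    broad = total_description_length(theory_bits=100, n_locations=2**10000)
    narrow = total_description_length(theory_bits=5000, n_locations=2**90)
    print(broad, narrow)   # 10100.0 vs 5090.0
    print(broad > narrow)  # True: localization cost reverses the ranking

On this accounting, ever larger postulated worlds eventually buy nothing: whatever is saved in stating the theory is repaid, with interest, in localizing the observer, which is one way to read the abstract's claim that predictive power eventually drops to zero.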