Large Alphabets and Incompressibility
We briefly survey some concepts related to empirical entropy -- normal
numbers, de Bruijn sequences and Markov processes -- and investigate how well
it approximates Kolmogorov complexity. Our results suggest that kth-order
empirical entropy stops being a reasonable complexity metric for almost all
strings of length m over alphabets of size n about when n^k surpasses m.
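As a rough illustration of the quantity this abstract studies, here is a minimal Python sketch of kth-order empirical entropy (the function name and the toy string are mine, not the paper's):

```python
import math
from collections import Counter, defaultdict

def empirical_entropy(s: str, k: int = 0) -> float:
    """k-th order empirical entropy H_k(s), in bits per symbol."""
    n = len(s)
    if k == 0:
        counts = Counter(s)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())
    # For each length-k context, collect the distribution of the next symbol.
    contexts = defaultdict(Counter)
    for i in range(n - k):
        contexts[s[i:i + k]][s[i + k]] += 1
    total = 0.0
    for counts in contexts.values():
        m = sum(counts.values())
        total += m * -sum((c / m) * math.log2(c / m) for c in counts.values())
    return total / (n - k)

# A perfectly periodic string has maximal H_0 over its two symbols...
print(empirical_entropy("abababababababab", k=0))  # → 1.0
# ...but zero H_1, since each 1-symbol context determines the next symbol.
print(empirical_entropy("abababababababab", k=1))  # → 0.0
```

This drop from H_0 to H_1 on highly regular strings is exactly why higher-order empirical entropy can wildly underestimate Kolmogorov complexity once contexts become sparse on large alphabets.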
Correlation of Automorphism Group Size and Topological Properties with Program-size Complexity Evaluations of Graphs and Complex Networks
We show that numerical approximations of Kolmogorov complexity (K) applied to
graph adjacency matrices capture some group-theoretic and topological
properties of graphs and empirical networks ranging from metabolic to social
networks. That K and the size of the group of automorphisms of a graph are
correlated opens up interesting connections to problems in computational
geometry, and thus connects several measures and concepts from complexity
science. We show that approximations of K characterise synthetic and natural
networks by their generating mechanisms, assigning lower algorithmic randomness
to complex network models (Watts-Strogatz and Barabási-Albert networks) and
high Kolmogorov complexity to (random) Erdős-Rényi graphs. We derive these
results via two different Kolmogorov complexity approximation methods applied
to the adjacency matrices of the graphs and networks. The methods used are the
traditional lossless compression approach to Kolmogorov complexity, and a
normalised version of a Block Decomposition Method (BDM) measure, based on
algorithmic probability theory.
Comment: 15 2-column pages, 20 figures. Forthcoming in Physica A: Statistical
Mechanics and its Applications.
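The "traditional lossless compression approach" mentioned above can be sketched in a few lines of Python. This is a hedged toy, not the paper's pipeline: graph sizes, the zlib compressor, and the ring-lattice stand-in for a pre-rewiring Watts-Strogatz model are my choices.

```python
import random
import zlib

def compress_len(adjacency_bits: str) -> int:
    """zlib-compressed size in bytes: a crude upper bound on K."""
    return len(zlib.compress(adjacency_bits.encode(), 9))

n = 64
random.seed(0)
# Erdos-Renyi-style adjacency bits: each entry 0/1 with probability 1/2.
er = "".join(random.choice("01") for _ in range(n * n))
# A regular ring lattice (each node linked to its 2 nearest neighbours per side).
ring = "".join("1" if min(abs(i - j), n - abs(i - j)) in (1, 2) else "0"
               for i in range(n) for j in range(n))

# The random graph resists compression; the structured one compresses heavily.
print(compress_len(er), compress_len(ring))
```

The ordering of the two numbers, not their exact values, is the point: lower compressed size stands in for lower algorithmic randomness, matching the abstract's ranking of network models versus random graphs.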
Kolmogorov complexity spectrum for use in analysis of UV-B radiation time series
We have used the Kolmogorov complexity and sample entropy measures to
estimate the complexity of the UV-B radiation time series in the Vojvodina
region (Serbia) for the period 1990-2007. We defined the Kolmogorov complexity
spectrum and have introduced the Kolmogorov complexity spectrum highest value
(KLM). We have established the UV-B radiation time series on the basis of their
daily sum (dose) for seven representative places in this region using (i)
measured data, (ii) data calculated via a derived empirical formula and (iii)
data obtained by a parametric UV radiation model. We have calculated the
Kolmogorov complexity (KL) based on the Lempel-Ziv Algorithm (LZA), KLM and
Sample Entropy (SE) values for each time series. We have divided the period
1990-2007 into two sub-intervals: (a) 1990-1998 and (b) 1999-2007 and calculated
the KL, KLM and SE values for the various time series in these sub-intervals.
It is found that during the period 1999-2007 there is a decrease in the KL,
KLM and SE compared to the period 1990-1998. This complexity loss may be
attributed to (i) increased human intervention in the post-civil-war period,
causing an increase in air pollution, and (ii) increased cloudiness due to
climate change.
Comment: 10 pages, 1 figure, 1 table. arXiv admin note: substantial text
overlap with arXiv:1301.2039; this paper has been accepted by Modern Physics
Letters B on Aug 14, 201
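The Lempel-Ziv Algorithm (LZA) underlying the KL measure can be illustrated with the classic 1976 phrase-counting parse. A minimal sketch, assuming the standard exhaustive-history parsing (the KL/KLM normalisation over a real binarised radiation series is not reproduced here):

```python
def lempel_ziv_complexity(s: str) -> int:
    """Number of phrases in the Lempel-Ziv (1976) exhaustive parsing of s."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # Extend the current phrase while it already occurs in the prior history.
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

# A constant sequence parses into just 2 phrases, however long it is...
print(lempel_ziv_complexity("0" * 100))            # → 2
# ...while an irregular binary sequence needs more.
print(lempel_ziv_complexity("0001101001000101"))   # → 6
```

In practice a time series is first binarised (e.g. by thresholding at its median) and the phrase count is normalised by n / log2(n), so that a fully random sequence scores near 1 and a regular one near 0; the drop in this score between sub-intervals is what the abstract reports as complexity loss.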
Zipf's law and L. Levin's probability distributions
Zipf's law in its basic incarnation is an empirical probability distribution
governing the frequency of usage of words in a language. As Terence Tao
recently remarked, it still lacks a convincing and satisfactory mathematical
explanation.
In this paper I suggest that at least in certain situations, Zipf's law can
be explained as a special case of the a priori distribution introduced and
studied by L. Levin. The Zipf ranking corresponding to diminishing probability
appears then as the ordering determined by the growing Kolmogorov complexity.
One argument justifying this assertion is the appeal to a recent
interpretation by Yu. Manin and M. Marcolli of asymptotic bounds for
error-correcting codes in terms of phase transition. In the respective
partition function, the Kolmogorov complexity of a code plays the role of its
energy.
This version contains minor corrections and additions.
Comment: 19 pages
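The rank-frequency relation at the heart of Zipf's law is easy to eyeball numerically. A toy check on an invented word list (the text and the constancy of r * f(r) here are illustrative only, not data from the paper):

```python
from collections import Counter

# Invented sample text; "the" is deliberately the most frequent word.
text = ("the quick brown fox jumps over the lazy dog " * 3 +
        "the dog and the fox ran over the brown log").split()

# Zipf's law predicts f(r) ≈ C / r, i.e. rank times frequency roughly constant.
freqs = sorted(Counter(text).values(), reverse=True)
print([r * f for r, f in enumerate(freqs, start=1)])
```

Levin's a priori distribution assigns a string x probability on the order of 2^(-K(x)), so ranking by diminishing probability is ranking by growing Kolmogorov complexity; the abstract's claim is that Zipfian rank-frequency data can arise as a special case of that ordering.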
Martingales, Efficient Market Hypothesis and Kolmogorov’s Complexity Theory
Efficient market theory states that financial markets can process information instantly. Empirical observations have challenged the stricter form of the efficient market hypothesis (EMH). These empirical observations and theoretical considerations show that price changes are difficult to predict if one starts from the time series of price changes. This paper provides an explanation in terms of Kolmogorov's theory of algorithmic complexity that makes a clearer connection between the efficient market hypothesis and the unpredictable character of stock returns.
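The link between unpredictability and algorithmic complexity can be made concrete with a compression test on return signs. A hedged sketch, not the paper's method: the sequences are synthetic, and zlib stands in as a computable proxy for incompressibility.

```python
import random
import zlib

def compressed_size(signs: str) -> int:
    """zlib-compressed size in bytes: a computable upper bound on complexity."""
    return len(zlib.compress(signs.encode(), 9))

random.seed(1)
# Martingale-like i.i.d. fair-coin return signs: no exploitable pattern.
coin = "".join(random.choice("+-") for _ in range(4096))
# A strongly patterned sign sequence: trivially predictable.
trend = ("+" * 8 + "-" * 8) * 256

# The coin sequence stays near its 1-bit-per-symbol entropy floor;
# the patterned one compresses far below it.
print(compressed_size(coin), compressed_size(trend))
```

A sequence that no program can compress is one whose next element no program can predict better than chance; that is the algorithmic-complexity reading of the EMH sketched in the abstract.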