Predictability: a way to characterize Complexity
Different aspects of the predictability problem in dynamical systems are
reviewed. The deep relation among Lyapunov exponents, Kolmogorov-Sinai entropy,
Shannon entropy and algorithmic complexity is discussed. In particular, we
emphasize how a characterization of the unpredictability of a system gives a
measure of its complexity. Adopting this point of view, we review some
developments in the characterization of the predictability of systems showing
different kinds of complexity: from low-dimensional systems to high-dimensional
ones with spatio-temporal chaos, and on to fully developed turbulence. Special
attention is devoted to finite-time and finite-resolution effects on
predictability, which can be accounted for with suitable generalizations of the
standard indicators. The problems arising in systems with intrinsic randomness
are discussed, with emphasis on the important problems of distinguishing chaos
from noise and of modeling the system. The characterization of irregular
behavior in systems with discrete phase space is also considered.
Comment: 142 LaTeX pages, 41 included EPS figures; submitted to Physics Reports.
Related information at this http://axtnt2.phys.uniroma1.i
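The link the abstract draws between Lyapunov exponents and unpredictability can be illustrated numerically. Below is a minimal Python sketch (an illustration, not code from the paper; the function name and parameters are invented for this example) that estimates the largest Lyapunov exponent of the logistic map by averaging log|f'(x)| along a trajectory:

```python
import math

def lyapunov_logistic(r, x0=0.1, n_transient=1000, n_iter=100_000):
    """Estimate the largest Lyapunov exponent of the logistic map
    x_{n+1} = r x_n (1 - x_n) by averaging log |f'(x_n)| = log |r (1 - 2 x_n)|
    along a trajectory after discarding a transient."""
    x = x0
    for _ in range(n_transient):       # let the orbit settle onto the attractor
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_iter):
        x = r * x * (1 - x)
        total += math.log(abs(r * (1 - 2 * x)))
    return total / n_iter
```

At r = 4 the map is fully chaotic and the exact exponent is ln 2 ≈ 0.693, so a positive estimate signals exponential divergence of nearby orbits; at r = 3.2 the attractor is a stable period-2 cycle and the estimate is negative (predictable motion).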
Towards a Coherent Theory of Physics and Mathematics
As an approach to a Theory of Everything a framework for developing a
coherent theory of mathematics and physics together is described. The main
characteristic of such a theory is discussed: the theory must be valid and
sufficiently strong, and it must maximally describe its own validity and
sufficient strength. The mathematical logical definition of validity is used,
and sufficient strength is seen to be a necessary and useful concept. The
requirement of maximal description of its own validity and sufficient strength
may be useful to reject candidate coherent theories for which the description
is less than maximal. Other aspects of a coherent theory discussed include
universal applicability, the relation to the anthropic principle, and possible
uniqueness. It is suggested that the basic properties of the physical and
mathematical universes are entwined with and emerge with a coherent theory.
Support for this includes the indirect reality status of properties of very
small or very large far away systems compared to moderate sized nearby systems.
Discussion of the necessary physical nature of language includes physical
models of language and a proof that the meaning content of expressions of any
axiomatizable theory seems to be independent of the algorithmic complexity of
the theory. Gödel maps seem to be less useful for a coherent theory than
for purely mathematical theories because all symbols and words of any language
must have representations as states of physical systems already in the domain of
a coherent theory.
Comment: 38 pages; earlier version extensively revised and clarified. Accepted
for publication in Foundations of Physics
Algorithms for necklace maps
Necklace maps visualize quantitative data associated with regions by placing scaled symbols, usually disks, without overlap on a closed curve (the necklace) surrounding the map regions. Each region is projected onto an interval on the necklace that contains its symbol. In this paper we address the algorithmic question of how to maximize symbol sizes while keeping symbols disjoint and inside their intervals. To this end we reduce the problem to a one-dimensional problem, which we solve efficiently. Solutions to the one-dimensional problem provide a very good approximation for the original necklace map problem. We consider two variants: Fixed-Order, where an order for the symbols on the necklace is given, and Any-Order, where any symbol order is possible. The Fixed-Order problem can be solved in O(n log n) time. We show that the Any-Order problem is NP-hard for certain types of intervals and give an exact algorithm for the decision version. This algorithm is fixed-parameter tractable in the thickness K of the input. Our algorithm runs in O(n log n + n^2 K 4^K) time, which can be improved to O(n log n + n K 2^K) time using a heuristic. We implemented our algorithm and evaluated it experimentally.
Keywords: necklace maps; scheduling; automated cartography
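The one-dimensional core of the Fixed-Order variant can be sketched as a greedy feasibility test combined with a binary search on the common scale factor. The Python sketch below is a simplified illustration under assumed input conventions (each symbol as a `(a, b, w)` triple with interval [a, b] and base width w), not the paper's O(n log n) algorithm:

```python
def feasible(scale, symbols):
    """Greedy left-to-right test: can every symbol, scaled by `scale`,
    be placed disjointly inside its own interval, in the given order?
    symbols: list of (a, b, w) triples, interval [a, b], base width w."""
    cursor = float("-inf")
    for a, b, w in symbols:
        left = max(cursor, a)              # earliest admissible left endpoint
        if left + scale * w > b + 1e-12:   # does not fit inside [a, b]
            return False
        cursor = left + scale * w          # next symbol must start after this one
    return True

def max_scale(symbols, iters=60):
    """Binary-search the largest feasible common scale factor."""
    lo, hi = 0.0, max((b - a) / w for a, b, w in symbols)
    for _ in range(iters):
        mid = (lo + hi) / 2
        if feasible(mid, symbols):
            lo = mid
        else:
            hi = mid
    return lo
```

For example, three unit-width symbols with intervals [0, 4], [2, 6], [5, 9] admit a maximum common scale of 3: the second symbol's interval caps the scale, since the first two symbols must share [0, 6] in order.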
Coding-theorem Like Behaviour and Emergence of the Universal Distribution from Resource-bounded Algorithmic Probability
Previously referred to as `miraculous' in the scientific literature because
of its powerful properties and its wide application as an optimal solution to
the problem of induction/inference, (approximations to) Algorithmic Probability
(AP) and the associated Universal Distribution are (or should be) of the
greatest importance in science. Here we investigate the emergence, the rates of
emergence and convergence, and the Coding-theorem-like behaviour of AP in
Turing-subuniversal models of computation. We investigate empirical
distributions of computing models in the Chomsky hierarchy. We introduce
measures of algorithmic probability and algorithmic complexity based upon
resource-bounded computation, in contrast to previously thoroughly investigated
distributions produced from the output distribution of Turing machines. This
approach allows for numerical approximations to algorithmic
(Kolmogorov-Chaitin) complexity-based estimations at each of the levels of a
computational hierarchy. We demonstrate that all these estimations are
correlated in rank and that they converge both in rank and values as a function
of computational power, despite fundamental differences between computational
models. In the context of natural processes that operate below the Turing
universal level because of finite resources and physical degradation, the
investigation of natural biases stemming from algorithmic rules may shed light
on the distribution of outcomes. We show that up to 60% of the
simplicity/complexity bias in distributions produced even by the weakest of the
computational models can be accounted for by Algorithmic Probability in its
approximation to the Universal Distribution.
Comment: 27 pages main text, 39 pages including supplement. Online complexity
calculator: http://complexitycalculator.com
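The coding-theorem relation K(s) ≈ −log2 m(s) behind such estimations can be illustrated with a toy sub-universal model. The Python sketch below is an illustration in the spirit of the paper, not its actual method or models: it enumerates all 256 elementary cellular automaton rules, tallies their centre-column outputs into an empirical distribution m, and converts output frequencies into complexity estimates.

```python
import math
from collections import Counter

def eca_step(cells, rule):
    """One step of an elementary cellular automaton, zero boundary conditions."""
    out = []
    for i in range(len(cells)):
        l = cells[i - 1] if i > 0 else 0
        c = cells[i]
        r = cells[i + 1] if i < len(cells) - 1 else 0
        out.append((rule >> ((l << 2) | (c << 1) | r)) & 1)
    return out

def centre_column(rule, steps=8):
    """Run `rule` from a single live cell; return the centre column as a bit string."""
    width = 2 * steps + 1
    cells = [0] * width
    cells[width // 2] = 1
    bits = []
    for _ in range(steps):
        cells = eca_step(cells, rule)
        bits.append(str(cells[width // 2]))
    return "".join(bits)

counts = Counter(centre_column(rule) for rule in range(256))
m = {s: c / 256 for s, c in counts.items()}   # empirical output distribution
K = {s: -math.log2(p) for s, p in m.items()}  # coding-theorem style estimate
```

By construction the most frequent output (e.g. the all-zero column produced by rule 0) receives the lowest complexity estimate, mirroring the simplicity bias the paper quantifies.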
PolyLogTools - Polylogs for the masses
We review recent developments in the study of multiple polylogarithms,
including the Hopf algebra of the multiple polylogarithms and the symbol map,
as well as the construction of single-valued multiple polylogarithms, and we
discuss an algorithm for finding fibration bases. We document how these
algorithms are implemented in the Mathematica package PolyLogTools and show how
it can be used to study the coproduct structure of polylogarithmic expressions
and how to compute iterated parametric integrals over polylogarithmic
expressions that show up in Feynman integral computations at low loop orders.
Comment: Package URL: https://gitlab.com/pltteam/pl
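Functional identities among polylogarithms, of the kind such packages manipulate symbolically, can also be checked numerically. A minimal Python sketch (using the defining series directly, not PolyLogTools, which is a Mathematica package) verifies Euler's reflection identity Li_2(z) + Li_2(1−z) = π²/6 − log(z)log(1−z):

```python
import math

def dilog(z, terms=400):
    """Dilogarithm Li_2(z) via its defining series sum_{k>=1} z^k / k^2,
    adequate for |z| well inside the unit disc."""
    return sum(z ** k / k ** 2 for k in range(1, terms + 1))

z = 0.3
lhs = dilog(z) + dilog(1 - z)
rhs = math.pi ** 2 / 6 - math.log(z) * math.log(1 - z)
```

The two sides agree to machine precision for z = 0.3; symbolic tools prove such relations exactly rather than term by term.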
Driven by Compression Progress: A Simple Principle Explains Essential Aspects of Subjective Beauty, Novelty, Surprise, Interestingness, Attention, Curiosity, Creativity, Art, Science, Music, Jokes
I argue that data becomes temporarily interesting by itself to some
self-improving, but computationally limited, subjective observer once he learns
to predict or compress the data in a better way, thus making it subjectively
simpler and more beautiful. Curiosity is the desire to create or discover more
non-random, non-arbitrary, regular data that is novel and surprising not in the
traditional sense of Boltzmann and Shannon but in the sense that it allows for
compression progress because its regularity was not yet known. This drive
maximizes interestingness, the first derivative of subjective beauty or
compressibility, that is, the steepness of the learning curve. It motivates
exploring infants, pure mathematicians, composers, artists, dancers, comedians,
yourself, and (since 1990) artificial systems.
Comment: 35 pages, 3 figures; based on KES 2008 keynote and ALT 2007 / DS 2007
joint invited lecture
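The notion of compression progress can be made concrete with an off-the-shelf compressor as a crude stand-in for the observer's predictive model. The Python sketch below is an illustration, not the author's formalism: it tracks the drop in compressed bytes per input byte ("interestingness" as the first derivative of compressibility) as more of a regular stream is observed.

```python
import zlib

def compressed_size(data: bytes) -> int:
    """Length of the zlib-compressed data at maximum compression level."""
    return len(zlib.compress(data, 9))

def compression_progress(chunks):
    """For a growing history of data chunks, return the compression ratio
    after each chunk and the ratio drop (progress) at each step."""
    history = b""
    ratios, progress = [], []
    for chunk in chunks:
        history += chunk
        ratios.append(compressed_size(history) / len(history))
        if len(ratios) > 1:
            progress.append(ratios[-2] - ratios[-1])  # positive = got simpler
    return ratios, progress
```

On a highly regular stream the ratio keeps falling, so every step registers positive progress; a truly random stream would yield none, matching the abstract's distinction from Boltzmann/Shannon novelty.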
Compression and diffusion: a joint approach to detect complexity
The adoption of the Kolmogorov-Sinai (KS) entropy is becoming a popular
research tool among physicists, especially when applied to a dynamical system
fitting the conditions of validity of the Pesin theorem. The study of time
series that are a manifestation of system dynamics whose rules are either
unknown or too complex for a mathematical treatment is still a challenge, since
the KS entropy is, in general, not computable in such cases. Here we present a
plan of action based on the joint action of two procedures, both related to the
KS entropy, but compatible with computer implementation through fast and
efficient programs. The former procedure, called Compression Algorithm
Sensitive To Regularity (CASToRe), establishes the amount of order by the
numerical evaluation of algorithmic compressibility. The latter, called Complex
Analysis of Sequences via Scaling AND Randomness Assessment (CASSANDRA),
establishes the degree of complexity through the numerical evaluation of the
strength of an anomalous effect, namely the departure of the diffusion process
generated by the observed fluctuations from ordinary Brownian motion.
The CASSANDRA algorithm shares with CASToRe a connection with the Kolmogorov
complexity. This makes both algorithms especially suitable to study the
transition from dynamics to thermodynamics, and the case of non-stationary time
series as well. The benefit of the joint action of these two methods is proven
by the analysis of artificial sequences with the same main properties as the
real time series to which the joint use of these two methods will be applied in
future research work.
Comment: 27 pages, 9 figures
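The diffusion side of this programme, comparing the diffusion process built from observed fluctuations against ordinary Brownian motion, can be caricatured with a simple scaling estimate. The Python sketch below is a toy, not the CASSANDRA algorithm: it estimates the exponent H from Var[window sums] ∝ w^(2H), where uncorrelated fluctuations give H ≈ 0.5, i.e. ordinary Brownian diffusion.

```python
import math
import random

def hurst_variance(seq, windows=(10, 20, 40, 80)):
    """Estimate the scaling exponent H from the variance of non-overlapping
    window sums: Var ~ w^(2H), fitted by least squares in log-log scale."""
    xs, ys = [], []
    for w in windows:
        sums = [sum(seq[i:i + w]) for i in range(0, len(seq) - w, w)]
        mean = sum(sums) / len(sums)
        var = sum((s - mean) ** 2 for s in sums) / len(sums)
        xs.append(math.log(w))
        ys.append(math.log(var))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return slope / 2  # Var ~ w^(2H), so the fitted slope equals 2H

# Demo: uncorrelated +/-1 fluctuations should give H close to 0.5.
random.seed(0)
seq = [random.choice((-1, 1)) for _ in range(100_000)]
H = hurst_variance(seq)
```

Anomalous diffusion of the kind CASSANDRA targets would show up as a systematic departure of H from 0.5, e.g. H > 0.5 for persistent, long-range-correlated fluctuations.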