Empirical Encounters with Computational Irreducibility and Unpredictability
There are several forms of irreducibility in computing systems, ranging from
undecidability to intractability to nonlinearity. This paper is an exploration
of the conceptual issues that have arisen in the course of investigating
speed-up and slowdown phenomena in small Turing machines. We present the
results of a test that may spur experimental approaches to the notion of
computational irreducibility. The test involves a systematic attempt to outrun
the computation of a large number of small Turing machines (all 3- and 4-state,
2-symbol) by means of integer sequence prediction using a specialized function
finder program. This massive experiment prompts an investigation into rates of
convergence of decision procedures and the decidability of sets in addition to
a discussion of the (un)predictability of deterministic computing systems in
practice. We think this investigation constitutes a novel approach to the
discussion of an epistemological question in the context of a computer
simulation, and thus represents an interesting exploration at the boundary
between philosophical concerns and computational experiments.
Comment: 18 pages, 4 figures
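To make the flavour of the experiment concrete, the toy sketch below simulates one hypothetical 2-symbol Turing machine, records an integer sequence derived from its tape, and tries to predict the next term by extrapolating a low-degree polynomial fitted to a prefix. The specific rule table, the choice of "number of 1s on the tape" as the observed sequence, and the polynomial predictor are illustrative assumptions only; the paper's own experiment enumerates all 3- and 4-state, 2-symbol machines and uses a specialized function finder.

```python
# Minimal sketch of the "outrunning" idea, not the paper's actual setup:
# simulate one small 2-symbol Turing machine, record an integer sequence
# (count of 1s on the tape after each step), fit a low-degree polynomial to
# a prefix, and check whether the extrapolation keeps matching the machine.

from collections import defaultdict
import numpy as np

# Transition table: (state, symbol) -> (write, move, next_state).
# This rule table is an arbitrary example, not a machine from the paper.
RULES = {
    (0, 0): (1, +1, 1),
    (0, 1): (1, -1, 1),
    (1, 0): (1, -1, 0),
    (1, 1): (0, +1, 0),
}

def run(steps):
    """Run the machine and return the count of 1s on the tape after each step."""
    tape, head, state, seq = defaultdict(int), 0, 0, []
    for _ in range(steps):
        write, move, next_state = RULES[(state, tape[head])]
        tape[head] = write
        head += move
        state = next_state
        seq.append(sum(tape.values()))
    return seq

def predict_next(prefix, degree=2):
    """Naive 'function finder': fit a polynomial to the prefix and extrapolate."""
    xs = np.arange(len(prefix))
    coeffs = np.polyfit(xs, prefix, degree)
    return round(float(np.polyval(coeffs, len(prefix))))

seq = run(60)
preds = [(predict_next(seq[:n]), seq[n]) for n in range(10, len(seq) - 1)]
hits = sum(p == actual for p, actual in preds)
print(f"correct one-step predictions: {hits}/{len(preds)}")
```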
Computational Irreducibility and Computational Analogy
In a previous paper [1], we provided a formal definition for the concept of
computational irreducibility (CIR), that is, the fact that for a function f from
N to N it is impossible to compute f(n) without following approximately the same
path as computing successively all the values f(i) from i = 1 to n. Our
definition is based on the concept of enumerating Turing machines (E-Turing
machines) and on the concept of approximation of E-Turing machines, for which we
also gave a formal definition. Here, we make these definitions more precise
through some modifications intended to improve the robustness of the concept. We
then introduce a new concept, computational analogy, and prove some properties
of the functions used. Computational analogy is an equivalence relation that
allows partitioning the set of computable functions into classes whose members
have the same properties regarding their CIR and their computational complexity.
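The informal idea behind CIR can be illustrated by contrasting a sequence with a known shortcut against one that, as far as is known, must be built up value by value. The sketch below uses triangular numbers and Hofstadter's Q-sequence as the two illustrative examples; this is only an intuition pump chosen here, not the paper's E-Turing-machine formalism.

```python
# Informal contrast between a "reducible" sequence (closed form available)
# and one for which no shortcut is known, so f(n) is in practice obtained by
# retracing all earlier values. Illustration only; not the paper's definition.

def triangular(n):
    """Reducible: f(n) = 1 + 2 + ... + n has the closed form n(n+1)/2."""
    return n * (n + 1) // 2

def hofstadter_q(n):
    """Q(n) = Q(n - Q(n-1)) + Q(n - Q(n-2)); computing Q(n) requires the
    whole table of earlier values, and no shortcut is known."""
    q = [0, 1, 1]  # 1-indexed; q[0] is unused padding
    for i in range(3, n + 1):
        q.append(q[i - q[i - 1]] + q[i - q[i - 2]])
    return q[n]

print(triangular(1000))    # instant, independent of earlier values
print(hofstadter_q(1000))  # obtained by building up Q(1)..Q(999) first
```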
On the necessity of complexity
Wolfram's Principle of Computational Equivalence (PCE) implies that universal
complexity abounds in nature. This paper comprises three sections. In the first
section we consider the question of why there are so many universal phenomena
around; in a sense, we seek a driving force behind the PCE, if any. We
postulate a principle GNS, which we call the Generalized Natural Selection
Principle, and which together with the Church-Turing Thesis is seen to be
equivalent to a weak version of PCE. In the second section we ask why we do
not observe any phenomena that are complex but not universal. We choose a
cognitive setting to address this question and make some analogies with
formal logic. In the third and final section we report on a case study in which
we see rich structures arise everywhere.
Comment: 17 pages, 3 figures
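As a concrete and standard illustration of the kind of universal complexity PCE appeals to (an example chosen here, not one taken from the paper): elementary cellular automaton rule 110 is defined by a trivial local update rule yet is known to be computationally universal. The sketch below simply prints its evolution from a single seed cell.

```python
# Elementary cellular automaton rule 110: a minimal example (not from the
# paper) of a very simple rule whose behaviour is rich enough to be
# computationally universal.

RULE = 110
WIDTH, STEPS = 64, 32

def step(cells):
    """Apply the rule-110 local update to every cell (fixed 0 boundaries)."""
    out = []
    for i in range(len(cells)):
        left = cells[i - 1] if i > 0 else 0
        right = cells[i + 1] if i < len(cells) - 1 else 0
        neighbourhood = (left << 2) | (cells[i] << 1) | right
        out.append((RULE >> neighbourhood) & 1)
    return out

cells = [0] * WIDTH
cells[-1] = 1  # single seed cell at the right edge
for _ in range(STEPS):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```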
Challenges in Complex Systems Science
FuturICT foundations are social science, complex systems science, and ICT.
The main concerns and challenges in the science of complex systems in the
context of FuturICT are laid out in this paper with special emphasis on the
Complex Systems route to Social Sciences. These include complex systems having:
many heterogeneous interacting parts; multiple scales; complicated transition
laws; unexpected or unpredicted emergence; sensitive dependence on initial
conditions; path-dependent dynamics; networked hierarchical connectivities;
interaction of autonomous agents; self-organisation; non-equilibrium dynamics;
combinatorial explosion; adaptivity to changing environments; co-evolving
subsystems; ill-defined boundaries; and multilevel dynamics. In this context,
science is seen as the process of abstracting the dynamics of systems from
data. This presents many challenges including: data gathering by large-scale
experiment, participatory sensing and social computation, managing huge
distributed dynamic and heterogeneous databases; moving from data to dynamical
models, going beyond correlations to cause-effect relationships, understanding
the relationship between simple and comprehensive models with appropriate
choices of variables, ensemble modeling and data assimilation, modeling systems
of systems of systems with many levels between micro and macro; and formulating
new approaches to prediction, forecasting, and risk, especially in systems that
can reflect on and change their behaviour in response to predictions, and
systems whose apparently predictable behaviour is disrupted by apparently
unpredictable rare or extreme events. These challenges are part of the FuturICT
agenda.
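One of the listed ingredients, sensitive dependence on initial conditions, can be illustrated with the textbook logistic map; the map and parameter values in the sketch below are generic illustrations and not part of the FuturICT programme.

```python
# Sensitive dependence on initial conditions, illustrated with the logistic
# map x -> r*x*(1-x): two nearly identical starting points diverge quickly.
# Generic textbook example, not FuturICT material.

r = 4.0
x, y = 0.300000, 0.300001  # two almost identical initial conditions

for t in range(40):
    if t % 10 == 0:
        print(f"t={t:2d}  x={x:.6f}  y={y:.6f}  |x-y|={abs(x - y):.6f}")
    x = r * x * (1 - x)
    y = r * y * (1 - y)
```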