Research and applications: Artificial intelligence
This report describes a program for developing techniques in artificial intelligence and applying them to the control of mobile automatons that carry out tasks autonomously. Visual scene analysis, short-term problem solving, and long-term problem solving are discussed, along with the PDP-15 simulator, the LISP-FORTRAN-MACRO interface, resolution strategies, and cost effectiveness.
Chaos and quantum-nondemolition measurements
The problem of chaotic behavior in quantum mechanics is investigated against the background of the theory of quantum-nondemolition (QND) measurements. The analysis is based on two relevant features: the outcomes of a sequence of QND measurements are unambiguously predictable, and these measurements can actually be performed on one single system without perturbing its time evolution. Consequently, QND measurements represent an appropriate framework in which to analyze the conditions for the occurrence of "deterministic randomness" in quantum systems. The general arguments are illustrated by a discussion of a quantum system whose time evolution possesses nonvanishing algorithmic complexity.
Does the universe in fact contain almost no information?
At first sight, an accurate description of the state of the universe appears
to require a mind-bogglingly large and perhaps even infinite amount of
information, even if we restrict our attention to a small subsystem such as a
rabbit. In this paper, it is suggested that most of this information is merely
apparent, as seen from our subjective viewpoints, and that the algorithmic
information content of the universe as a whole is close to zero. It is argued
that if the Schrödinger equation is universally valid, then decoherence
together with the standard chaotic behavior of certain non-linear systems will
make the universe appear extremely complex to any self-aware subsets that
happen to inhabit it now, even if it was in a quite simple state shortly after
the big bang. For instance, gravitational instability would amplify the
microscopic primordial density fluctuations that are required by the Heisenberg
uncertainty principle into quite macroscopic inhomogeneities, forcing the
current wavefunction of the universe to contain such Byzantine superpositions
as our planet being in many macroscopically different places at once. Since
decoherence bars us from experiencing more than one macroscopic reality, we
would see seemingly complex constellations of stars etc, even if the initial
wavefunction of the universe was perfectly homogeneous and isotropic.
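The gap between apparent and algorithmic complexity that the paper trades on is easy to demonstrate. In the sketch below (an illustration, not from the paper; the generator and all parameters are arbitrary choices), a deterministic program of a few lines, whose algorithmic information content is essentially the program text plus one seed, emits a bit string that a general-purpose compressor cannot shrink at all:

import zlib

def chaotic_bits(n, x=0.3):
    """n bytes of bits from the logistic map at r = 4: a simple rule whose
    output looks statistically random."""
    out = bytearray()
    byte = 0
    for i in range(n * 8):
        x = 4.0 * x * (1.0 - x)          # one chaotic iteration
        byte = (byte << 1) | (x > 0.5)   # record which half x landed in
        if i % 8 == 7:
            out.append(byte)
            byte = 0
    return bytes(out)

data = chaotic_bits(10_000)
print(len(data), "->", len(zlib.compress(data, 9)))  # essentially no compression

The compressor sees only apparent complexity; the true algorithmic information content of the string is bounded by the length of the short generator above, mirroring the paper's claim that a simple initial wavefunction can look Byzantine from the inside.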
Pseudorandom Number Generators and the Square Site Percolation Threshold
A select collection of pseudorandom number generators is applied to a Monte
Carlo study of the two-dimensional square site percolation model. A generator
suitable for high precision calculations is identified from an application
specific test of randomness. After extended computation and analysis, an
ostensibly reliable value of p_c = 0.59274598(4) is obtained for the percolation threshold.
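For concreteness, the following is a minimal sketch of the underlying Monte Carlo procedure, not the paper's optimized code: each sample opens sites independently with probability p, and union-find detects a top-to-bottom crossing cluster. Python's built-in Mersenne Twister stands in for the generators the paper actually scrutinizes, and the lattice size, seed, and trial count are arbitrary illustrative choices.

import random

class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))
    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x
    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.parent[ra] = rb

def spans(L, p, rng):
    """One sample: does an open cluster connect the top row to the bottom row?"""
    open_site = [rng.random() < p for _ in range(L * L)]
    uf = UnionFind(L * L + 2)          # two extra virtual nodes: top and bottom
    TOP, BOTTOM = L * L, L * L + 1
    for i in range(L):
        for j in range(L):
            k = i * L + j
            if not open_site[k]:
                continue
            if i == 0:
                uf.union(k, TOP)
            if i == L - 1:
                uf.union(k, BOTTOM)
            for ni, nj in ((i + 1, j), (i, j + 1)):   # open neighbors below/right
                if ni < L and nj < L and open_site[ni * L + nj]:
                    uf.union(k, ni * L + nj)
    return uf.find(TOP) == uf.find(BOTTOM)

rng = random.Random(1)
L, p, trials = 64, 0.592746, 200
hits = sum(spans(L, p, rng) for _ in range(trials))
print(f"crossing probability at p = {p}: {hits / trials:.3f}")

An application-specific randomness test of the kind the abstract mentions would compare such crossing estimates, at much larger L and trial counts, across candidate generators and flag any whose results drift from the consensus value.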
AGI and the Knight-Darwin Law: why idealized AGI reproduction requires collaboration
Can an AGI create a more intelligent AGI? Under idealized assumptions, for a certain theoretical type of intelligence, our answer is: “Not without outside help”. This is a paper on the mathematical structure of AGI populations when parent AGIs create child AGIs. We argue that such populations satisfy a certain biological law. Motivated by observations of sexual reproduction in seemingly-asexual species, the Knight-Darwin Law states that it is impossible for one organism to asexually produce another, which asexually produces another, and so on forever: any sequence of organisms (each one a child of the previous) must contain occasional multi-parent organisms, or must terminate. By proving that a certain measure (arguably an intelligence measure) decreases when an idealized parent AGI single-handedly creates a child AGI, we argue that a similar Law holds for AGIs.
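The skeleton of that argument is a well-foundedness claim: if some measure on AGIs takes values in a well-ordered set and strictly decreases every time a single parent produces a child, then no purely asexual lineage can continue forever. The toy sketch below (hypothetical names, with natural numbers standing in for the paper's intelligence measure) shows only that skeleton, not the paper's actual construction:

def asexual_lineage(root_measure: int):
    """Yield the measures along a lineage in which each single-parent child
    has a strictly smaller measure than its parent."""
    m = root_measure
    while m >= 0:
        yield m
        m -= 1   # any strict decrease works; the naturals are well-founded

print(list(asexual_lineage(5)))   # [5, 4, 3, 2, 1, 0] -- then the chain stops

Once the measure bottoms out, the lineage must either terminate or admit a multi-parent step, which is exactly the Knight-Darwin pattern.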
Computational and Biological Analogies for Understanding Fine-Tuned Parameters in Physics
In this philosophical paper, we explore computational and biological
analogies to address the fine-tuning problem in cosmology. We first clarify
what it means for physical constants or initial conditions to be fine-tuned. We
review important distinctions such as the dimensionless and dimensional
physical constants, and the classification of constants proposed by
Lévy-Leblond. Then we explore how two great analogies, computational and
biological, can give new insights into our problem. This paper includes a
preliminary study to examine the two analogies. Importantly, analogies are both
useful and fundamental cognitive tools, but can also be misused or
misinterpreted. The idea that our universe might be modelled as a computational
entity is analysed, and we discuss the distinction between physical laws and
initial conditions using algorithmic information theory. Smolin introduced the
theory of "Cosmological Natural Selection" with a biological analogy in mind.
We examine an extension of this analogy involving intelligent life. We discuss
if and how this extension could be legitimated.
Keywords: origin of the universe, fine-tuning, physical constants, initial
conditions, computational universe, biological universe, role of intelligent
life, cosmological natural selection, cosmological artificial selection,
artificial cosmogenesis.
Coarse-graining of cellular automata, emergence, and the predictability of complex systems
We study the predictability of emergent phenomena in complex systems. Using
nearest neighbor, one-dimensional Cellular Automata (CA) as an example, we show
how to construct local coarse-grained descriptions of CA in all classes of
Wolfram's classification. The resulting coarse-grained CA that we construct are
capable of emulating the large-scale behavior of the original systems without
accounting for small-scale details. Several CA that can be coarse-grained by
this construction are known to be universal Turing machines; they can emulate
any CA or other computing devices and are therefore undecidable. We thus show
that because in practice one only seeks coarse-grained information, complex
physical systems can be predictable and even decidable at some level of
description. The renormalization group flows that we construct induce a
hierarchy of CA rules. This hierarchy agrees well with apparent rule complexity
and is therefore a good candidate for a complexity measure and a classification
method. Finally we argue that the large scale dynamics of CA can be very
simple, at least when measured by the Kolmogorov complexity of the large scale
update rule, and moreover exhibits a novel scaling law. We show that because of
this large-scale simplicity, the probability of finding a coarse-grained
description of CA approaches unity as one goes to increasingly coarser scales.
We interpret this large scale simplicity as a pattern formation mechanism in
which large scale patterns are forced upon the system by the simplicity of the
rules that govern the large scale dynamics.
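The construction rests on a commutation condition that is straightforward to test numerically: a supercell projection P and coarse rule f' coarse-grain a fine rule f when projecting after N fine time steps agrees with a single coarse step applied after projecting, N being the supercell width so that the light cones match. The sketch below is an illustration rather than the paper's code; it brute-forces candidate pairs for one simple fine rule, and a pair that passes is only a candidate, since the test samples finitely many random configurations instead of proving the identity:

import itertools, random

def step(cells, rule):
    """One synchronous update of an elementary CA with periodic boundaries."""
    n = len(cells)
    return [(rule >> (cells[(i - 1) % n] << 2 | cells[i] << 1 | cells[(i + 1) % n])) & 1
            for i in range(n)]

def project(cells, P, N):
    """Map each width-N supercell through the projection table P."""
    return [P[tuple(cells[i:i + N])] for i in range(0, len(cells), N)]

def commutes(fine, coarse, P, N, trials=50, supercells=16, seed=0):
    """Check P(f^N(x)) == f'(P(x)) on random periodic configurations."""
    rng = random.Random(seed)
    for _ in range(trials):
        x = [rng.randrange(2) for _ in range(N * supercells)]
        y = x
        for _ in range(N):
            y = step(y, fine)
        if project(y, P, N) != step(project(x, P, N), coarse):
            return False
    return True

# Search: which (projection, coarse rule) pairs coarse-grain rule 128 with
# width-2 supercells?  The all-AND projection, for instance, should map
# rule 128 back onto itself.
N, fine = 2, 128
blocks = list(itertools.product((0, 1), repeat=N))
for P_vals in itertools.product((0, 1), repeat=len(blocks)):
    if len(set(P_vals)) < 2:
        continue                      # skip trivial constant projections
    P = dict(zip(blocks, P_vals))
    for coarse in range(256):
        if commutes(fine, coarse, P, N):
            print(f"projection {P_vals} -> coarse rule {coarse}")
            break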
The Computational Complexity of Symbolic Dynamics at the Onset of Chaos
In a variety of studies of dynamical systems, the edge of order and chaos has
been singled out as a region of complexity. It was suggested by Wolfram, on the
basis of qualitative behaviour of cellular automata, that the computational
basis for modelling this region is the Universal Turing Machine. In this paper,
following a suggestion of Crutchfield, we try to show that the Turing machine
model may often be too powerful as a computational model to describe the
boundary of order and chaos. In particular we study the region of the first
accumulation of period doubling in unimodal and bimodal maps of the interval,
from the point of view of language theory. We show that in relation to the
"extended" Chomsky hierarchy, the relevant computational model in the
unimodal case is the nested stack automaton or the related indexed languages,
while the bimodal case is modeled by the linear bounded automaton or the
related context-sensitive languages.
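As a quick numerical handle on the objects involved: at the accumulation of period doubling, the orbit of the critical point generates a symbol sequence, and the languages analyzed in the paper are built from its finite subwords. The sketch below (approximate parameter value; purely illustrative) prints the start of that itinerary for the logistic map and counts distinct subwords of a few lengths as a crude probe of the language's complexity:

R_INF = 3.5699456   # approximate period-doubling accumulation point

def itinerary(r, n, x=0.5):
    """Symbolic orbit of the critical point: 'L' if x < 1/2, else 'R'."""
    symbols = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        symbols.append('L' if x < 0.5 else 'R')
    return ''.join(symbols)

word = itinerary(R_INF, 4096)
print(word[:32])
for k in (1, 2, 4, 8):
    subwords = {word[i:i + k] for i in range(len(word) - k + 1)}
    print(f"distinct subwords of length {k}: {len(subwords)}")

With only an approximate value of r the deeper counts are indicative rather than exact, but the slow growth already suggests a highly constrained language, consistent with the paper's placement of the unimodal case well below full Turing power.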
Universal fluctuations in subdiffusive transport
Subdiffusive transport in tilted washboard potentials is studied within the
fractional Fokker-Planck equation approach, using the associated continuous
time random walk (CTRW) framework. The scaled subvelocity is shown to obey a
universal law, assuming the form of a stationary Lévy-stable distribution. The
latter is defined by the index of subdiffusion alpha and the mean subvelocity
only, but interestingly depends neither on the bias strength nor on the
specific form of the potential. These scaled, universal subvelocity
fluctuations emerge due to the weak ergodicity breaking and are vanishing in
the limit of normal diffusion. The results of the analytical heuristic theory
are corroborated by Monte Carlo simulations of the underlying CTRW.
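As a minimal illustration of the CTRW mechanism, though not of the paper's washboard-potential model, the sketch below runs a biased walk whose waiting times are Pareto-distributed with index alpha < 1, so their mean diverges. The scaled subvelocity x(T)/T^alpha then stays broadly scattered across realizations even for long T, the weak-ergodicity-breaking fingerprint described above; all parameter values are arbitrary:

import random

def ctrw_position(T, alpha, bias, rng):
    """Walker position at time T: heavy-tailed waits, then a biased +/-1 jump."""
    t, x = 0.0, 0
    while True:
        u = 1.0 - rng.random()                   # u in (0, 1]
        t += u ** (-1.0 / alpha)                 # Pareto waiting time, tau >= 1
        if t > T:
            return x
        x += 1 if rng.random() < bias else -1    # biased jump

rng = random.Random(0)
alpha, bias, T, runs = 0.5, 0.8, 1e6, 2000
v = sorted(ctrw_position(T, alpha, bias, rng) / T ** alpha for _ in range(runs))
print("median scaled subvelocity :", v[runs // 2])
print("10th / 90th percentiles   :", v[runs // 10], v[9 * runs // 10])

The persistent spread between the percentiles, which does not narrow as T grows, is the hallmark that distinguishes this regime from normal diffusion, where the scaled velocity of every long trajectory converges to the same number.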