The Third Way of Agent-Based Social Simulation and a Computational Account of Emergence
Abstract This paper interprets a particular agent-based social simulation (ABSS) in terms of Conte's third way of understanding agent-based simulation. It is proposed that the normalized compression distance (derived from estimates of Kolmogorov complexity) between the initial and final macro-level states of the ABSS provides a quantitative measure of the degree to which the results obtained via the ABSS might instead be obtained via a closed-form expression. If the final macro-level state of an ABSS can only be obtained by simulation, this confers on agent-based social simulations a special status. Future empirical (computational) work and epistemological analyses are proposed.
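The normalized compression distance mentioned in the abstract can be approximated by substituting any real compressor for the uncomputable Kolmogorov complexity. A minimal sketch with zlib; the two "macro-level states" below are stand-in byte strings of our own invention, not the output of any actual ABSS:

```python
import random
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance, with zlib standing in for the
    uncomputable complexity C:
    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
    cx = len(zlib.compress(x, 9))
    cy = len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Toy "macro-level states": a highly regular initial state and an
# incompressible-looking final state.
initial = bytes(200) + bytes([1]) * 200
rng = random.Random(0)
final = bytes(rng.randrange(256) for _ in range(400))

# A state is close to itself and far from an unrelated random state.
print(ncd(initial, initial) < ncd(initial, final))  # expected: True
```

A small NCD between initial and final states would suggest the final state is largely predictable from the initial one; a large NCD is consistent with the abstract's suggestion that the result may only be reachable by running the simulation.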
MDL Convergence Speed for Bernoulli Sequences
The Minimum Description Length principle for online sequence
estimation/prediction in a proper learning setup is studied. If the underlying
model class is discrete, then the total expected square loss is a particularly
interesting performance measure: (a) this quantity is finitely bounded,
implying convergence with probability one, and (b) it additionally specifies
the convergence speed. For MDL, in general one can only have loss bounds which
are finite but exponentially larger than those for Bayes mixtures. We show that
this is even the case if the model class contains only Bernoulli distributions.
We derive a new upper bound on the prediction error for countable Bernoulli
classes. This implies a small bound (comparable to the one for Bayes mixtures)
for certain important model classes. We discuss the application to Machine
Learning tasks such as classification and hypothesis testing, and
generalization to countable classes of i.i.d. models.
Comment: 28 pages.
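The online prediction protocol described above can be made concrete with a toy discrete Bernoulli class. The grid of parameters, the uniform prior weights, and the plug-in square-loss bookkeeping below are all illustrative choices of ours, not the paper's construction:

```python
import math
import random

def two_part_codelength(theta, ones, zeros):
    """-log2 P_theta(data so far); infinite if theta assigns probability 0."""
    if (theta == 0.0 and ones) or (theta == 1.0 and zeros):
        return math.inf
    L = 0.0
    if ones:
        L -= ones * math.log2(theta)
    if zeros:
        L -= zeros * math.log2(1.0 - theta)
    return L

def mdl_square_loss(true_p=0.3, n=2000, seed=1):
    """Two-part MDL plug-in prediction over a finite Bernoulli class,
    accumulating the squared distance to the true parameter."""
    thetas = [k / 10 for k in range(11)]   # discrete model class
    w_bits = math.log2(len(thetas))        # uniform code for the model index
    rng = random.Random(seed)
    ones = zeros = 0
    total_sq_loss = 0.0
    theta_star = 0.5
    for _ in range(n):
        # Pick the theta of minimal total (model + data) codelength so far.
        theta_star = min(
            thetas,
            key=lambda t: two_part_codelength(t, ones, zeros) + w_bits)
        total_sq_loss += (theta_star - true_p) ** 2
        if rng.random() < true_p:
            ones += 1
        else:
            zeros += 1
    return theta_star, total_sq_loss
```

The cumulative square loss stabilizes once the MDL estimator locks onto the grid point nearest the true parameter; the abstract's point is about how large such cumulative losses can be in the worst case relative to Bayes mixtures, which this toy run does not exhibit.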
Time and Space Bounds for Reversible Simulation
We prove a general upper bound on the tradeoff between time and space that
suffices for the reversible simulation of irreversible computation. Previously,
only simulations using exponential time or quadratic space were known.
The tradeoff shows for the first time that we can simultaneously achieve
subexponential time and subquadratic space.
The boundary values are the exponential time with hardly any extra space
required by the Lange-McKenzie-Tapp method and the ()th power time with
square space required by the Bennett method. We also give the first general
lower bound on the extra storage space required by general reversible
simulation. This lower bound is optimal in that it is achieved by some
reversible simulations.
Comment: 11 pages LaTeX. Proc. ICALP 2001, Lecture Notes in Computer Science, Vol. xxx, Springer-Verlag, Berlin, 2001.
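Time-space tradeoffs for reversible simulation are commonly explained via Bennett's reversible pebble game, where a pebble is a saved machine configuration and a move (un)simulates one segment of the computation. The following toy is our illustration of the basic recursive strategy, not the paper's construction; it exhibits one point of the tradeoff, roughly T^{log2 3} moves with O(log T) pebbles:

```python
def bennett_pebble(k):
    """Bennett-style recursive pebbling of a computation of length 2**k.
    Rule: a pebble at position p may be placed or removed only while
    position p - 1 carries a pebble (you can only (un)compute forward
    from a saved configuration)."""
    pebbles = {0}   # the input configuration is always available
    peak = 1
    moves = 0

    def toggle(pos):
        nonlocal peak, moves
        assert pos - 1 in pebbles    # legality of the move
        moves += 1
        if pos in pebbles:
            pebbles.remove(pos)
        else:
            pebbles.add(pos)
            peak = max(peak, len(pebbles))

    def go(start, span):
        # Place (or, run a second time, remove) the pebble at start + span,
        # assuming a pebble at start.
        if span == 1:
            toggle(start + 1)
            return
        half = span // 2
        go(start, half)          # pebble the midpoint
        go(start + half, half)   # pebble the endpoint from the midpoint
        go(start, half)          # un-pebble the midpoint, freeing space

    go(0, 2 ** k)
    return moves, peak

# 2**k simulated segments cost 3**k segment (un)computations, k + 2 pebbles:
for k in range(1, 8):
    m, p = bennett_pebble(k)
    assert m == 3 ** k and p == k + 2
```

Varying the recursion arity instead of always halving trades the exponent in the move count against the pebble count, which is the shape of the time-space tradeoff the abstract quantifies.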
Algorithmic statistics: forty years later
Algorithmic statistics has two different (and almost orthogonal) motivations.
From the philosophical point of view, it tries to formalize how statistics
works and why some statistical models are better than others. After this notion
of a "good model" is introduced, a natural question arises: is it possible that
for some piece of data there is no good model? If so, how often do such bad
("non-stochastic") data appear "in real life"?
Another, more technical motivation comes from algorithmic information theory.
In this theory a notion of complexity of a finite object (=amount of
information in this object) is introduced; it assigns to every object some
number, called its algorithmic complexity (or Kolmogorov complexity).
Algorithmic statistics provides a more fine-grained classification: for each
finite object some curve is defined that characterizes its behavior. It turns
out that several different definitions give (approximately) the same curve.
In this survey we try to provide an exposition of the main results in the
field (including full proofs for the most important ones), as well as some
historical comments. We assume that the reader is familiar with the main
notions of algorithmic information (Kolmogorov complexity) theory.
Comment: Missing proofs added.
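The per-object curve mentioned above is the Kolmogorov structure function, which weighs a model's complexity against the codelength needed to pick the object out of the model. True complexity is uncomputable, so here is a toy two-part code over one hypothetical model family, the set of all bit strings with a given length and number of ones; every encoding choice below is our own simplification:

```python
import math

def two_part_point(x: str):
    """One point of a toy structure-function analogue: the model for x is
    the set of all strings sharing its length n and number of ones k."""
    n, k = len(x), x.count("1")
    model_bits = 2 * math.ceil(math.log2(n + 1))  # crude code for (n, k)
    data_bits = math.log2(math.comb(n, k))        # index of x inside the model
    return model_bits, data_bits

# A string that is "typical" for its model vs. one its model pins down exactly:
balanced = "0110100110010110"   # n = 16, k = 8
constant = "0000000000000000"   # n = 16, k = 0
print(two_part_point(balanced))  # large data part: x looks random in its set
print(two_part_point(constant))  # data part 0: the model determines x
```

Sweeping over richer and richer model families for a fixed x, and keeping the best data part at each model size, traces out the curve the survey studies; "non-stochastic" objects are those for which no small model ever makes the data part drop near the minimum.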
Entropy and Quantum Kolmogorov Complexity: A Quantum Brudno's Theorem
In classical information theory, entropy rate and Kolmogorov complexity per
symbol are related by a theorem of Brudno. In this paper, we prove a quantum
version of this theorem, connecting the von Neumann entropy rate and two
notions of quantum Kolmogorov complexity, both based on the shortest qubit
descriptions of qubit strings that, run by a universal quantum Turing machine,
reproduce them as outputs.
Comment: 26 pages, no figures. Reference to publication added: published in
Communications in Mathematical Physics (http://www.springerlink.com/content/1432-0916/).
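Brudno's classical theorem equates the entropy rate of an ergodic source with the Kolmogorov complexity per symbol of its typical trajectories. A real compressor gives a rough empirical analogue; zlib is only a stand-in for complexity here, and the i.i.d. Bernoulli source is our toy choice:

```python
import random
import zlib

def per_symbol_rate(p: float, n: int = 50_000, seed: int = 0) -> float:
    """Compressed size in bits per symbol of an i.i.d. Bernoulli(p) sequence,
    a crude proxy for complexity per symbol."""
    rng = random.Random(seed)
    data = bytes(1 if rng.random() < p else 0 for _ in range(n))
    return 8 * len(zlib.compress(data, 9)) / n

# Lower-entropy sources compress to fewer bits per symbol, mirroring the
# binary entropy h(p) = -p*log2(p) - (1-p)*log2(1-p): h(0.05) < h(0.5) = 1.
print(per_symbol_rate(0.05) < per_symbol_rate(0.5))  # expected: True
```

The quantum theorem proved in the paper plays the same game with von Neumann entropy rate and qubit descriptions run on a universal quantum Turing machine, for which no such simple compressor analogue exists.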
Statistical physics of language dynamics
Language dynamics is a rapidly growing field that focuses on all processes related to the emergence, evolution, change and extinction of languages. Recently, the study of self-organization and evolution of language and meaning has led to the idea that a community of language users can be seen as a complex dynamical system, which collectively solves the problem of developing a shared communication framework through the back-and-forth signaling between individuals.
We shall review some of the progress made in the past few years and highlight potential future directions of research in this area. In particular, the emergence of a common lexicon and of a shared set of linguistic categories will be discussed, as examples corresponding to the early stages of a language. The extent to which synthetic modeling is nowadays contributing to the ongoing debate in cognitive science will be pointed out. In addition, the rapid growth of the web is providing new experimental frameworks: it makes available a huge amount of resources, both as novel tools and as data to be analyzed, allowing quantitative and large-scale analysis of the processes underlying the emergence of collective information and language dynamics.
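A minimal model of the "emergence of a common lexicon" discussed above is the naming game. The bare-bones sketch below is our simplification (mean-field pairing, integer "words", and the drop-everything-on-success rule); it shows a population self-organizing onto a single shared word with no central coordination:

```python
import random

def naming_game(n_agents: int = 50, max_games: int = 100_000, seed: int = 2):
    """Minimal naming game: a random speaker utters a known (or freshly
    invented) word to a random hearer; on success both collapse their
    vocabularies to that word, on failure the hearer learns it."""
    rng = random.Random(seed)
    vocab = [set() for _ in range(n_agents)]
    next_word = 0
    for game in range(1, max_games + 1):
        speaker, hearer = rng.sample(range(n_agents), 2)
        if not vocab[speaker]:
            vocab[speaker].add(next_word)  # invent a brand-new word
            next_word += 1
        word = rng.choice(sorted(vocab[speaker]))
        if word in vocab[hearer]:          # success: both keep only this word
            vocab[speaker] = {word}
            vocab[hearer] = {word}
        else:                              # failure: hearer learns the word
            vocab[hearer].add(word)
        if all(v == vocab[0] and len(v) == 1 for v in vocab):
            return game                    # global consensus reached
    return None
```

After a transient in which competing words spread, the population reaches consensus in a number of games that grows superlinearly with the population size, one of the convergence phenomena the statistical-physics literature reviewed here analyzes quantitatively.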