A simple model for the evolution of molecular codes driven by the interplay of accuracy, diversity and cost
Molecular codes translate information written in one type of molecule into
another molecular language. We introduce a simple model that treats molecular
codes as noisy information channels. An optimal code is a channel that conveys
information accurately and efficiently while keeping down the impact of errors.
The equipoise of the three conflicting needs, for minimal error-load, minimal
cost of resources and maximal diversity of vocabulary, defines the fitness of
the code. The model suggests a mechanism for the emergence of a code when
evolution varies the parameters that control this equipoise and the mapping
between the two molecular languages becomes non-random. This mechanism is
demonstrated by a simple toy model that is formally equivalent to a mean-field
Ising magnet.

Comment: Keywords: molecular codes, rate-distortion theory, biological
information channels, stochastic maps, genetic code, genetic network
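The three-way trade-off described in this abstract can be made concrete with a small numerical sketch. The snippet below is not the paper's model, only a minimal stand-in under assumed ingredients: a random stochastic encoder p(codeword | meaning), a uniform misreading channel with error rate eps, a Hamming-like distortion matrix, and arbitrary weights w_cost and w_div on the competing terms (error-load, cost, diversity). A crude random search then picks the fittest encoder.

```python
# Minimal sketch (not the paper's exact model): score stochastic encoders
# by the three competing terms named in the abstract. The distortion
# matrix, noise level, and weights are all illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_meanings, n_codewords = 4, 4
D = 1.0 - np.eye(n_meanings)            # distortion d(i, j): reading meaning i as j
eps = 0.1                                # probability a codeword is misread
R = (1 - eps) * np.eye(n_codewords) \
    + eps / (n_codewords - 1) * (1 - np.eye(n_codewords))

def fitness(E, w_cost=0.1, w_div=0.5):
    """Score an encoder E[i, c] = p(codeword c | meaning i); higher is fitter."""
    posterior = E / (E.sum(axis=0, keepdims=True) + 1e-12)  # p(meaning | codeword)
    decoded = posterior.argmax(axis=0)                      # ML decoder per codeword
    channel = E @ R                                         # p(read codeword | meaning)
    # error load: expected distortion through encode -> misread -> decode
    load = np.mean([channel[i] @ D[i, decoded] for i in range(n_meanings)])
    cost = w_cost * np.count_nonzero(E.max(axis=0) > 1e-3)  # codewords actually used
    p_c = E.mean(axis=0)                                    # codeword usage frequency
    diversity = w_div * -(p_c * np.log(p_c + 1e-12)).sum()  # vocabulary entropy
    return -load - cost + diversity

best = max((rng.dirichlet(np.ones(n_codewords), n_meanings) for _ in range(2000)),
           key=fitness)
print(np.round(best, 2))
```

Varying eps and the weights shifts the winning encoders between degenerate and near one-to-one mappings, qualitatively mirroring the code-emergence mechanism the abstract describes.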
The Value of Information for Populations in Varying Environments
The notion of information pervades informal descriptions of biological
systems, but formal treatments face the problem of defining a quantitative
measure of information rooted in a concept of fitness, which is itself an
elusive notion. Here, we present a model of population dynamics where this
problem is amenable to a mathematical analysis. In the limit where any
information about future environmental variations is common to the members of
the population, our model is equivalent to known models of financial
investment. In this case, the population can be interpreted as a portfolio of
financial assets and previous analyses have shown that a key quantity of
Shannon's communication theory, the mutual information, sets a fundamental
limit on the value of information. We show that this bound can be violated when
accounting for features that are irrelevant in finance but inherent to
biological systems, such as the stochasticity present at the individual level.
This leads us to generalize the measures of uncertainty and information usually
encountered in information theory.
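The finance limit referenced above has a compact classical form: in Kelly's setup with fair odds, the growth-rate gain from a side-information channel equals the mutual information between environment and cue (Kelly 1956); this is the bound the abstract says individual-level stochasticity can violate. A minimal sketch, with an assumed two-state environment and noisy cue:

```python
# Kelly betting with side information: the optimal log-growth-rate gain
# equals I(X;Y). The distributions below are illustrative assumptions.
import numpy as np

p_x = np.array([0.7, 0.3])                      # environment distribution p(x)
p_y_given_x = np.array([[0.9, 0.1],             # noisy cue about the environment
                        [0.2, 0.8]])
p_xy = p_x[:, None] * p_y_given_x               # joint p(x, y)
p_y = p_xy.sum(axis=0)
p_x_given_y = p_xy / p_y                        # posterior used for betting

def growth(bets, cond):
    """Expected log-growth with fair odds 1/p(x): E[log(b / p(x))]."""
    return (p_xy * np.log(bets / p_x[:, None])).sum() if cond else \
           (p_x * np.log(bets / p_x)).sum()

# optimal Kelly bets are proportional to beliefs about x
gain = growth(p_x_given_y, cond=True) - growth(p_x, cond=False)
mi = (p_xy * np.log(p_xy / (p_x[:, None] * p_y[None, :]))).sum()
print(f"growth-rate gain = {gain:.4f} nats, I(X;Y) = {mi:.4f} nats")
```

The two printed values agree exactly, illustrating the bound that holds when information is common to the whole population.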
Structure or Noise?
We show how rate-distortion theory provides a mechanism for automated theory
building by naturally distinguishing between regularity and randomness. We
start from the simple principle that model variables should, as much as
possible, render the future and past conditionally independent. From this, we
construct an objective function for model making whose extrema embody the
trade-off between a model's structural complexity and its predictive power. The
solutions correspond to a hierarchy of models that, at each level of
complexity, achieve optimal predictive power at minimal cost. In the limit of
maximal prediction the resulting optimal model identifies a process's intrinsic
organization by extracting the underlying causal states. In this limit, the
model's complexity is given by the statistical complexity, which is known to be
minimal for achieving maximum prediction. Examples show how theory building can
profit from analyzing a process's causal compressibility, which is reflected in
the optimal models' rate-distortion curve--the process's characteristic for
optimally balancing structure and noise at different levels of representation.

Comment: 6 pages, 2 figures; http://cse.ucdavis.edu/~cmg/compmech/pubs/son.htm
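The objective described here can be illustrated with the information-bottleneck form of the trade-off: compress the past into model states R while retaining information about the future, extremizing I(Past; R) - beta * I(R; Future). The sketch below is an illustration under assumptions, not the paper's algorithm: it runs the standard self-consistent bottleneck iteration on length-2 past/future blocks of the golden-mean process (binary sequences with no consecutive 0s), with the trade-off parameter beta and the number of states chosen arbitrarily.

```python
# Information-bottleneck iteration on past/future blocks of the
# golden-mean process; beta and n_states are illustrative choices.
import numpy as np
from itertools import product

rng = np.random.default_rng(1)

pi = {'A': 2 / 3, 'B': 1 / 3}       # stationary distribution over hidden states
T = {('A', 1): ('A', 0.5), ('A', 0): ('B', 0.5), ('B', 1): ('A', 1.0)}

# joint distribution of (past, future) = (first two symbols, last two symbols)
p_xy = np.zeros((4, 4))
for w in product([0, 1], repeat=4):
    prob = 0.0
    for s0, p0 in pi.items():
        p, s = p0, s0
        for sym in w:
            step = T.get((s, sym))
            if step is None:        # forbidden word (contains "00")
                p = 0.0
                break
            s, tp = step
            p *= tp
        prob += p
    p_xy[2 * w[0] + w[1], 2 * w[2] + w[3]] = prob

def ib(p_xy, n_states, beta, iters=500):
    """Self-consistent bottleneck equations; returns the soft map p(r | past)."""
    p_x = p_xy.sum(axis=1)
    p_f = np.divide(p_xy, p_x[:, None], out=np.zeros_like(p_xy),
                    where=p_x[:, None] > 0)               # p(future | past)
    q = rng.dirichlet(np.ones(n_states), size=len(p_x))   # p(r | past)
    for _ in range(iters):
        p_r = p_x @ q                                     # p(r)
        p_fr = ((q * p_x[:, None]) / (p_r + 1e-12)).T @ p_f   # p(future | r)
        kl = (p_f[:, None, :] * (np.log(p_f[:, None, :] + 1e-12)
                                 - np.log(p_fr[None, :, :] + 1e-12))).sum(-1)
        q = p_r[None, :] * np.exp(-beta * kl)             # bottleneck update
        q /= q.sum(axis=1, keepdims=True)
    return q

# at large beta, pasts with identical conditional futures share a state,
# recovering the two causal states (last symbol 1 vs. last symbol 0)
print(np.round(ib(p_xy, n_states=3, beta=10.0), 2))
```

At large beta the surviving states merge pasts with identical futures, which is the causal-state extraction the abstract describes in the maximal-prediction limit.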
Predictability, complexity and learning
We define {\em predictive information} $I_{\rm pred}(T)$ as the mutual
information between the past and the future of a time series. Three
qualitatively different behaviors are found in the limit of large observation
times $T$: $I_{\rm pred}(T)$ can remain finite, grow logarithmically, or grow
as a fractional power law. If the time series allows us to learn a model with a
finite number of parameters, then $I_{\rm pred}(T)$ grows logarithmically with
a coefficient that counts the dimensionality of the model space. In contrast,
power--law growth is associated, for example, with the learning of infinite
parameter (or nonparametric) models such as continuous functions with
smoothness constraints. There are connections between the predictive
information and measures of complexity that have been defined both in learning
theory and in the analysis of physical systems through statistical mechanics
and dynamical systems theory. Further, in the same way that entropy provides
the unique measure of available information consistent with some simple and
plausible conditions, we argue that the divergent part of $I_{\rm pred}(T)$
provides the unique measure for the complexity of dynamics underlying a time
series. Finally, we discuss how these ideas may be useful in different problems
in physics, statistics, and biology.

Comment: 53 pages, 3 figures, 98 references, LaTeX2
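The "remains finite" case is easy to exhibit exactly. For a stationary first-order Markov chain the block entropy is linear, H(T) = H(pi) + (T - 1) h, so the predictive information between adjacent length-T blocks, I_pred(T) = 2 H(T) - H(2T), collapses to the constant H(pi) - h for every T >= 1. The transition matrix below is an arbitrary choice for illustration:

```python
# Exact predictive information of a two-state Markov chain: it is
# finite and constant in T. The transition matrix is an assumption.
import numpy as np

P = np.array([[0.9, 0.1],                        # transition matrix p(x' | x)
              [0.4, 0.6]])
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi /= pi.sum()                                   # stationary distribution

H = lambda p: -(p * np.log2(p + 1e-15)).sum()
h = (pi * np.array([H(row) for row in P])).sum() # entropy rate (bits/symbol)

block_H = lambda T: H(pi) + (T - 1) * h          # exact Markov block entropy
for T in (1, 2, 4, 8):
    print(T, 2 * block_H(T) - block_H(2 * T))    # I_pred(T): constant, finite
print("H(pi) - h =", H(pi) - h)
```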
Occam's Quantum Strop: Synchronizing and Compressing Classical Cryptic Processes via a Quantum Channel
A stochastic process's statistical complexity stands out as a fundamental
property: the minimum information required to synchronize one process generator
to another. How much information is required, though, when synchronizing over a
quantum channel? Recent work demonstrated that representing causal similarity
as quantum state-indistinguishability provides a quantum advantage. We
generalize this to synchronization and offer a sequence of constructions that
exploit extended causal structures, finding a substantial increase in the
quantum advantage. We demonstrate that maximum compression is determined by the
process's cryptic order---a classical, topological property closely allied to
Markov order, itself a measure of historical dependence. We introduce an
efficient algorithm that computes the quantum advantage and close by noting
that the advantage comes at a cost---one trades off prediction for generation
complexity.

Comment: 10 pages, 6 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/oqs.ht
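The kind of quantum advantage discussed here is simplest to see in the basic construction that this paper generalizes: encode each causal state as a non-orthogonal quantum signal state, so the required memory is the von Neumann entropy Cq = S(rho), which falls below the classical statistical complexity Cmu = H(pi). A sketch for the two-state perturbed-coin process, with the perturbation p an assumed value:

```python
# Quantum vs. classical memory for the perturbed-coin process: the
# non-orthogonal signal states give Cq = S(rho) < Cmu = H(pi).
# The perturbation value p is an illustrative assumption.
import numpy as np

p = 0.3
# signal states: |eta_0> = sqrt(1-p)|0> + sqrt(p)|1>, and mirrored for |eta_1>
eta0 = np.array([np.sqrt(1 - p), np.sqrt(p)])
eta1 = np.array([np.sqrt(p), np.sqrt(1 - p)])
pi = np.array([0.5, 0.5])                 # stationary causal-state distribution

rho = pi[0] * np.outer(eta0, eta0) + pi[1] * np.outer(eta1, eta1)
evals = np.linalg.eigvalsh(rho)
Cq = -(evals * np.log2(evals + 1e-15)).sum()       # von Neumann entropy S(rho)
Cmu = -(pi * np.log2(pi)).sum()                    # classical memory H(pi)

print(f"Cmu = {Cmu:.3f} bits, Cq = {Cq:.3f} bits, advantage = {Cmu - Cq:.3f}")
```

As p approaches 0.5 the signal states coincide and Cq goes to zero; the constructions in this paper push the advantage further by exploiting the process's extended causal (cryptic) structure.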