On the Foundations of the Theory of Evolution
Darwinism conceives evolution as a consequence of random variation and
natural selection, hence it is based on a materialistic, i.e. matter-based,
view of science inspired by classical physics. Matter itself, however, is a
far more complex notion in modern physics: at the microscopic level, matter
and energy no longer retain their simple form, and quantum mechanical models
are proposed wherein potential form is considered in addition to actual form.
In this paper we propose an alternative
to standard Neodarwinian evolution theory. We suggest that the starting point
of evolution theory cannot be limited to actual variation upon which
selection acts, but must extend to variation in the potential of entities
according to the
context. We therefore develop a formalism, referred to as Context driven
Actualization of Potential (CAP), which handles potentiality and describes the
evolution of entities as an actualization of potential through a reiterated
interaction with the context. As in quantum mechanics, lack of knowledge of the
entity, its context, or the interaction between context and entity leads to
different forms of indeterminism in relation to the state of the entity. This
indeterminism generates a non-Kolmogorovian distribution of probabilities that
is different from the classical distribution of chance described by Darwinian
evolution theory, which stems from an 'actuality focused', i.e. materialistic,
view of nature. We also present a quantum evolution game that highlights the
main differences arising from our new perspective and shows that it is more
fundamental to consider evolution in general, and biological evolution in
particular, as a process of actualization of potential induced by context, of
which the material reduction is only a special case.
Comment: 11 pages, no figure
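The contrast between a Kolmogorovian mixture of actualities and a superposition of potentials can be made concrete with a toy numerical example. This is an illustration of the general quantum-probability idea, not the paper's CAP formalism; the amplitude values are hypothetical.

```python
import math

# Toy illustration (not the paper's CAP formalism): amplitudes replace
# probabilities, so combining two potential contexts produces an
# interference term that no classical (Kolmogorovian) mixture can.
a1 = 1 / math.sqrt(2)   # amplitude under context 1 (hypothetical value)
a2 = 1 / math.sqrt(2)   # amplitude under context 2 (hypothetical value)

p1 = abs(a1) ** 2       # probability if context 1 alone actualizes
p2 = abs(a2) ** 2       # probability if context 2 alone actualizes

classical = 0.5 * p1 + 0.5 * p2               # classical 50/50 mixture
quantum = abs((a1 + a2) / math.sqrt(2)) ** 2  # superposed potential

print(classical, quantum)   # 0.5 vs 1.0: interference doubles the probability
```

With equal amplitudes, the classical mixture predicts 0.5 while the superposed potential actualizes with probability 1.0; the difference is exactly the interference term that makes the resulting statistics non-Kolmogorovian.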
On the Computational Power of DNA Annealing and Ligation
In [20] it was shown that the DNA primitives of Separate, Merge, and
Amplify were not sufficiently powerful to invert functions defined by
circuits in linear time. Dan Boneh et al. [4] show that the addition of a
ligation primitive, Append, provides the missing power. The question
becomes: "How powerful is ligation? Are Separate, Merge, and Amplify
necessary at all?" This paper informally explores the power of annealing
and ligation for DNA computation. We conclude that, in fact, annealing and
ligation alone are theoretically capable of universal computation.
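The two primitives can be sketched in a few lines. The model below is a toy abstraction (not the paper's formal machinery): a short "splint" strand anneals across the junction of two strands by Watson-Crick complementarity, holding them adjacent so that ligation can join them into one strand.

```python
# Toy model of template-directed ligation (an illustration of annealing
# plus ligation, not the paper's formal model). Strands are 5'->3'
# strings over {A, C, G, T}.

COMP = str.maketrans("ACGT", "TGCA")

def revcomp(strand):
    """Watson-Crick reverse complement of a 5'->3' strand."""
    return strand.translate(COMP)[::-1]

def splint_ligate(s1, s2, splint, k=4):
    """Join s1 and s2 only if `splint` anneals across the junction,
    i.e. it is the reverse complement of the last k bases of s1
    followed by the first k bases of s2."""
    if splint == revcomp(s1[-k:] + s2[:k]):
        return s1 + s2
    return None

s1, s2 = "GATTACA", "TTGACCT"
splint = revcomp(s1[-4:] + s2[:4])        # complementary splint strand
print(splint_ligate(s1, s2, splint))      # ligation succeeds: s1 + s2
print(splint_ligate(s1, s2, "AAAAAAAA"))  # no annealing -> None
```

Because the splint only binds when both half-sites match, annealing acts as a sequence-checking step and ligation as concatenation, which is the combination the paper argues is computationally universal.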
Active Self-Assembly of Algorithmic Shapes and Patterns in Polylogarithmic Time
We describe a computational model for studying the complexity of
self-assembled structures with active molecular components. Our model captures
notions of growth and movement ubiquitous in biological systems. The model is
inspired by biology's fantastic ability to assemble biomolecules that form
systems with complicated structure and dynamics, from molecular motors that
walk on rigid tracks and proteins that dynamically alter the structure of the
cell during mitosis, to embryonic development where large-scale complicated
organisms efficiently grow from a single cell. Using this active self-assembly
model, we show how to efficiently self-assemble shapes and patterns from simple
monomers. For example, we show how to grow a line of monomers in time and
number of monomer states that are merely logarithmic in the length of the line.
Our main results show how to grow arbitrary connected two-dimensional
geometric shapes and patterns in expected time that is polylogarithmic in the
size of the shape, plus roughly the time required to run a Turing machine
deciding whether or not a given pixel is in the shape. We do this while keeping
the number of monomer types logarithmic in shape size, plus those monomers
required by the Kolmogorov complexity of the shape or pattern. This work thus
highlights the efficiency advantages of active self-assembly over passive
self-assembly and motivates experimental effort to construct general-purpose
active molecular self-assembly systems.
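The logarithmic line-growth claim can be illustrated with a simple doubling scheme. This is a toy sketch, not the paper's actual monomer rules: every monomer in a state below a threshold K divides into two monomers of the next state, so a single seed grows into a line of 2**K monomers in K rounds, using K + 1 states in total.

```python
# Toy sketch of active growth by doubling (not the paper's construction):
# each monomer in state k < K divides into two monomers in state k + 1;
# monomers in state K are inert. Time and number of states are both
# logarithmic in the final line length.

def grow_line(K):
    line = [0]        # a single seed monomer in state 0
    rounds = 0
    while any(s < K for s in line):
        line = [t for s in line
                  for t in ((s + 1, s + 1) if s < K else (s,))]
        rounds += 1
    return line, rounds

line, rounds = grow_line(10)
print(len(line), rounds)   # 1024 monomers grown in 10 rounds
```

Passive self-assembly must attach monomers one by one at the line's ends, taking time linear in the length; the active model's insertion everywhere along the line is what buys the exponential speedup.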
Reasoning about the garden of forking paths
Lazy evaluation is a powerful tool for functional programmers. It enables the
concise expression of on-demand computation and a form of compositionality not
available under other evaluation strategies. However, the stateful nature of
lazy evaluation makes it hard to analyze a program's computational cost, either
informally or formally. In this work, we present a novel and simple framework
for formally reasoning about lazy computation costs based on a recent model of
lazy evaluation: clairvoyant call-by-value. The key feature of our framework is
its simplicity, as expressed by our definition of the clairvoyance monad. This
monad is both simple to define (around 20 lines of Coq) and simple to reason
about. We show that this monad can be effectively used to mechanically reason
about the computational cost of lazy functional programs written in Coq.
Comment: 28 pages, accepted by ICFP'2
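The shape of the clairvoyance monad can be sketched outside Coq. In the sketch below (a Python illustration of the idea, not the paper's definition), a computation denotes a set of (value, cost) outcomes: `bind` sequences computations and sums their costs, `tick` charges one step, and `choice` records the nondeterministic decision of whether a thunk is ever evaluated.

```python
# Python sketch of the clairvoyance monad's structure (the paper defines
# it in Coq): nondeterminism plus a cost counter.

def ret(x):
    return {(x, 0)}

def bind(m, k):
    return {(y, c1 + c2) for (x, c1) in m for (y, c2) in k(x)}

def tick(m):
    return {(x, c + 1) for (x, c) in m}

def choice(m, n):
    return m | n

def cost(m):
    # Clairvoyant call-by-value takes the cheapest run; this sketch does
    # not enforce that skipped thunks are never demanded.
    return min(c for (_, c) in m)

# A thunk that is either evaluated now (one tick) or skipped for free.
thunk = choice(tick(ret("forced")), ret(None))
prog = bind(thunk, lambda v: ret(v))
print(cost(prog))   # 0: the cheapest run never forces the thunk
```

The "clairvoyance" is exactly this minimum over runs: instead of threading a heap of thunks through the evaluation, every run guesses up front which bindings will be needed, and the cheapest consistent guess yields the lazy program's cost.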