601 research outputs found
Planning for Contingencies: A Decision-based Approach
A fundamental assumption made by classical AI planners is that there is no
uncertainty in the world: the planner has full knowledge of the conditions
under which the plan will be executed and the outcome of every action is fully
predictable. These planners cannot therefore construct contingency plans, i.e.,
plans in which different actions are performed in different circumstances. In
this paper we discuss some issues that arise in the representation and
construction of contingency plans and describe Cassandra, a partial-order
contingency planner. Cassandra uses explicit decision-steps that enable the
agent executing the plan to decide which plan branch to follow. The
decision-steps in a plan result in subgoals to acquire knowledge, which are
planned for in the same way as any other subgoals. Cassandra thus distinguishes
the process of gathering information from the process of making decisions. The
explicit representation of decisions in Cassandra allows a coherent approach to
the problems of contingent planning, and provides a solid base for extensions
such as the use of different decision-making procedures.
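As a rough illustration of the decision-step idea described above, the toy sketch below separates the knowledge-gathering subgoal (a sensing stub) from the decision itself; all names are hypothetical and do not reflect Cassandra's actual plan representation.

```python
# Toy sketch of a contingency plan with an explicit decision-step.
# All function names are illustrative, not Cassandra's actual API.

def sense_road_blocked():
    # Knowledge-acquisition subgoal: in a contingency planner this would
    # be planned for like any other subgoal; here we simply stub the
    # observation so the example is self-contained.
    return False

def plan_branch_detour():
    return ["take_detour", "arrive"]

def plan_branch_direct():
    return ["drive_straight", "arrive"]

def execute_contingency_plan():
    # The decision-step: the executing agent chooses a plan branch only
    # once the required knowledge has been gathered.
    if sense_road_blocked():
        return plan_branch_detour()
    return plan_branch_direct()

print(execute_contingency_plan())  # branch chosen at execution time
```

The point of the sketch is the ordering: sensing (information gathering) is a distinct step that must complete before the branch choice is made, mirroring the paper's separation of the two processes.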
The blind leading the blind: Mutual refinement of approximate theories
The mutual refinement theory, a method for refining world models in a reactive system, is described. The method detects failures, explains their causes, and repairs the approximate models which cause the failures. The approach focuses on using one approximate model to refine another.
A Cross-disciplinary Framework for the Description of Contextually Mediated Change
We present a mathematical framework (referred to as Context-driven
Actualization of Potential, or CAP) for describing how entities change over
time under the influence of a context. The approach facilitates comparison of
change of state of entities studied in different disciplines. Processes are
seen to differ according to the degree of nondeterminism, and the degree to
which they are sensitive to, internalize, and depend upon a particular context.
Our analysis suggests that the dynamical evolution of a quantum entity
described by the Schrödinger equation is not fundamentally different from the
change provoked by a measurement, often referred to as collapse, but rather a
limiting case with only one way to collapse. The biological transition to coded
replication is seen as a means of preserving structure in the face of
context-driven change, and sexual replication as a means of increasing
potentiality, thus enhancing diversity through interaction with context. The
framework sheds light on concepts like selection and fitness, reveals how
exceptional Darwinian evolution is as a means of 'change of state', and
clarifies in what sense culture, and the creative process underlying it, are
Darwinian.
Contextualizing concepts using a mathematical generalization of the quantum formalism
We outline the rationale and preliminary results of using the State Context
Property (SCOP) formalism, originally developed as a generalization of quantum
mechanics, to describe the contextual manner in which concepts are evoked,
used, and combined to generate meaning. The quantum formalism was developed to
cope with problems arising in the description of (1) the measurement process,
and (2) the generation of new states with new properties when particles become
entangled. Similar problems arising with concepts motivated the formal
treatment introduced here. Concepts are viewed not as fixed representations,
but as entities existing in states of potentiality that require interaction
with a context---a stimulus or another concept---to `collapse' to observable
form as an exemplar, prototype, or other (possibly imaginary) instance. The
stimulus situation plays the role of the measurement in physics, acting as a
context that induces a change of the cognitive state from a superposition
state to a collapsed state. The collapsed state is more likely to consist of a
conjunction of concepts for associative than for analytic thought, because
more stimulus or concept properties take part in the collapse. We provide two
contextual measures of conceptual distance---one using collapse probabilities
and the other weighted properties---and show how they can be applied to
conjunctions using the pet fish problem.
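A minimal sketch of the first of these distance measures, assuming invented collapse probabilities; the numbers, the exemplar sets, and the Euclidean form of the distance are all illustrative placeholders, not the measure defined in the paper.

```python
# Toy collapse-probability distance between two concepts, in the spirit
# of the SCOP discussion above. All probabilities are invented.
import math

# Hypothetical probabilities that each concept 'collapses' to a given
# exemplar under some fixed context.
pet  = {"cat": 0.45, "dog": 0.45, "guppy": 0.10}
fish = {"trout": 0.50, "shark": 0.40, "guppy": 0.10}

def collapse_distance(p, q):
    # Euclidean distance between the two collapse distributions over the
    # union of exemplars; zero iff the concepts collapse identically.
    exemplars = set(p) | set(q)
    return math.sqrt(sum((p.get(e, 0.0) - q.get(e, 0.0)) ** 2
                         for e in exemplars))

print(round(collapse_distance(pet, fish), 3))
```

In this toy setup, a context such as the "pet fish" conjunction would shift both distributions toward shared exemplars like "guppy", shrinking the distance.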
Modelling the evolution of biological complexity with a two-dimensional lattice self-assembly process
Self-assembling systems are prevalent across numerous scales of nature, lying at the heart of diverse physical and biological phenomena.
Individual protein subunits self-assembling into complexes is often a vital first step of biological processes.
Errors during protein assembly, due to mutations or misfolds, can have devastating effects and are responsible for an assortment of protein diseases, known as proteopathies.
With proteins exhibiting endless layers of complexity, building any all-encompassing model is unrealistic.
Coarse-grained models, despite not faithfully capturing every detail of the original system, have massive potential to assist in understanding complex phenomena.
A principal actor in self-assembly is the binding interactions between subunits, and so geometric constraints, polarity, kinetic forces, etc. can often be marginalised.
This work explores how self-assembly and its outcomes are inextricably tied to the involved interactions through the use of a two-dimensional lattice polyomino model.
First, this thesis addresses how the interaction characteristics of self-assembly building blocks determine what structures they form.
Specifically, whether the same structures are consistently produced and whether they remain finite in size.
Assembly graphs store subunit interaction information and are used to classify these two properties, determinism and boundedness respectively.
Arbitrary sets of building blocks are classified without the costly overhead of repeated stochastic assembling, improving both the analysis speed and accuracy.
Furthermore, assembly graphs naturally integrate combinatorial and graph techniques, enabling a wider range of future polyomino studies.
The second part homes in on the implications of nondeterministic assembly for the evolution of interaction strengths.
Generalising subunit binding sites with mutable binary strings introduces such interaction strengths into the polyomino model.
Deterministic assemblies obey analytic expectations.
Conversely, interactions in nondeterministic assemblies rapidly diverge from equilibrium to minimise assembly inconsistency.
Optimal interaction strengths during assembly are also reflected in evolution.
Transitions between certain polyominoes are strongly forbidden when interaction strengths are misaligned.
The third aspect focuses on genetic duplication, an evolutionary event observed in organisms across all taxa.
Through polyomino evolutions, a duplication-heteromerisation pathway emerges as an efficient process.
This pathway exploits the advantages of both self-interactions and pairwise-interactions, and accelerates evolution by avoiding complexity bottlenecks.
Several simulation predictions are successfully validated against a large data set of protein complexes.
These results focus on coarse-grained models rather than quantified biological insight.
Despite this, they reinforce existing observations of protein complexes and pose several new mechanisms for the evolution of biological complexity.
Communication Theoretic Data Analytics
Widespread use of the Internet and social networks drives the generation of
big data, which is proving to be useful in a number of applications. To deal
with explosively growing amounts of data, data analytics has emerged as a
critical technology related to computing, signal processing, and information
networking. In this paper, a formalism is considered in which data is modeled
as a generalized social network and communication theory and information theory
are thereby extended to data analytics. First, the creation of an equalizer to
optimize information transfer between two data variables is considered, and
financial data is used to demonstrate the advantages. Then, an information
coupling approach based on information geometry is applied for dimensionality
reduction, with a pattern recognition example to illustrate the effectiveness.
These initial trials suggest the potential of communication theoretic data
analytics for a wide range of applications.
Published in IEEE Journal on Selected Areas in Communications, Jan. 201
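A back-of-the-envelope sketch of measuring "information transfer between two data variables", using a plug-in mutual-information estimate on toy binary data; the paper's equalizer construction and information-geometry machinery are not reproduced here, and the data are invented.

```python
# Plug-in mutual-information estimate between two discrete variables,
# illustrating the information-transfer idea above (toy data only).
from collections import Counter
from math import log2

x = [0, 0, 1, 1, 0, 1, 0, 1]
y = [0, 0, 1, 1, 1, 1, 0, 0]   # partially coupled to x

def mutual_information(xs, ys):
    # I(X;Y) = sum over (a, b) of p(a, b) * log2(p(a, b) / (p(a) p(b))),
    # with probabilities estimated by empirical counts.
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum((c / n) * log2((c * n) / (px[a] * py[b]))
               for (a, b), c in pxy.items())

print(round(mutual_information(x, y), 3))
```

Maximizing an estimate like this over a transformation of one variable is, loosely, what an "equalizer to optimize information transfer" does; here we only evaluate it on fixed data.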