Causality in concurrent systems
Concurrent systems are systems, whether software, hardware or even
biological, that are characterized by sets of independent actions that
can be executed in any order or simultaneously. Computer scientists resort to
causal terminology to describe and analyse the relations between the actions in
these systems. However, a thorough discussion of the meaning of causality in
such a context has not yet been developed. This paper aims to fill that gap.
First, the paper analyses the notion of causation in concurrent systems and
attempts to build bridges with the existing philosophical literature,
highlighting similarities and divergences between them. Second, the paper
analyses the use of counterfactual reasoning in ex-post analysis in concurrent
systems (i.e. execution trace analysis).
Comment: This is an interdisciplinary paper. It addresses a class of causal
models developed in computer science from an epistemic perspective, namely in
terms of the philosophy of causality.
Processes, Roles and Their Interactions
Taking an interaction-network-oriented perspective in informatics raises the
challenge of describing deterministic finite systems that take part in networks
of nondeterministic interactions. The traditional approach, which describes
processes as stepwise executable activities not based on the ordinarily
nondeterministic interactions, shows strong centralization tendencies.
As suggested in this article, viewing processes and their interactions as
complementary can circumvent these centralization tendencies.
The description of both processes and their interactions is based on the
same building blocks, namely finite input/output automata (or transducers).
Processes are viewed as finite systems that take part in multiple, ordinarily
nondeterministic interactions. The interactions between processes are described
as protocols.
The effects of communication between processes, as well as the necessary
coordination of different interactions within a process, are both based on the
restriction of the transition relation of product automata. The channel-based
outer coupling represents the causal relation between the output and the input
of different systems. The coordination-condition-based inner coupling
represents the causal relation between the input and output of a single system.
All steps are illustrated with the example of a network of resource
administration processes that is supposed to provide requesting user processes
with exclusive access to a single resource.
Comment: In Proceedings IWIGP 2012, arXiv:1202.422
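The restriction of a product transition relation by channel-based outer coupling can be sketched as follows; the automata, state names, and channel symbols are invented here for illustration (a single requesting process coupled to a single administrator), not taken from the paper:

```python
# Each transducer: dict mapping (state, input) -> (next state, output).
requester = {
    ("idle", "work"): ("waiting", "req"),
    ("waiting", "grant"): ("busy", "ack"),
}
admin = {
    ("free", "req"): ("taken", "grant"),
}

def coupled_steps(a_trans, b_trans, joint_state):
    """Joint steps of the product automaton, restricted so that the
    output of A is consumed as the input of B (outer coupling)."""
    sa, sb = joint_state
    steps = []
    for (s, x), (s2, out) in a_trans.items():
        if s != sa:
            continue                      # A cannot fire from this state
        if (sb, out) in b_trans:          # B must accept A's output
            tb, b_out = b_trans[(sb, out)]
            steps.append(((s2, tb), x, b_out))
    return steps
```

From the joint state ("idle", "free") the only coupled step feeds the requester's "req" output into the administrator, yielding the joint state ("waiting", "taken") with external output "grant"; unmatched output/input pairs are simply pruned from the product.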
A Logic Programming Approach to Knowledge-State Planning: Semantics and Complexity
We propose a new declarative planning language, called K, which is based on
principles and methods of logic programming. In this language, transitions
between states of knowledge can be described, rather than transitions between
completely described states of the world, which makes the language well-suited
for planning under incomplete knowledge. Furthermore, it enables the use of
default principles in the planning process by supporting negation as failure.
Nonetheless, K also supports the representation of transitions between states
of the world (i.e., states of complete knowledge) as a special case, which
shows that the language is very flexible. As we demonstrate on particular
examples, the use of knowledge states may allow for a natural and compact
problem representation. We then provide a thorough analysis of the
computational complexity of K, and consider different planning problems,
including standard planning and secure planning (also known as conformant
planning) problems. We show that these problems have different complexities
under various restrictions, ranging from NP to NEXPTIME in the propositional
case. Our results form the theoretical basis for the DLV^K system, which
implements the language K on top of the DLV logic programming system.
Comment: 48 pages, appeared as a Technical Report at KBS of the Vienna
University of Technology, see http://www.kr.tuwien.ac.at/research/reports
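To illustrate the difference between world states and knowledge states, here is a toy forward-search planner over knowledge states in Python; this is not K syntax, the fluents and actions are hypothetical, and it omits K's distinctive features such as negation as failure and secure planning:

```python
from collections import deque

# Each action: (name, preconditions, add-literals, delete-literals),
# over signed literals such as "+door_open". A knowledge state records
# only the literals that are known; everything else is unknown, unlike
# a completely described world state.
ACTIONS = [
    ("open_door", frozenset(), frozenset({"+door_open"}), frozenset()),
    ("go_through", frozenset({"+door_open"}), frozenset({"+inside"}), frozenset()),
]

def plan(init, goal):
    """Breadth-first search over knowledge states."""
    start = frozenset(init)
    frontier = deque([(start, [])])
    seen = {start}
    while frontier:
        state, steps = frontier.popleft()
        if goal <= state:
            return steps
        for name, pre, add, delete in ACTIONS:
            if pre <= state:              # action is known to be executable
                nxt = (state - delete) | add
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, steps + [name]))
    return None
```

Starting from the empty knowledge state (nothing known), plan(set(), {"+inside"}) finds ["open_door", "go_through"] without ever committing to a complete description of the world, which is the compactness the abstract alludes to.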
Refinement for Probabilistic Systems with Nondeterminism
Before we combine actions and probabilities two very obvious questions should
be asked. Firstly, what does "the probability of an action" mean? Secondly, how
does probability interact with nondeterminism? Neither question has a single
universally agreed upon answer but by considering these questions at the outset
we build a novel and hopefully intuitive probabilistic event-based formalism.
In previous work we have characterised refinement via the notion of testing.
Basically, if one system passes all the tests that another system passes (and
maybe more), we say the first system is a refinement of the second. This is, in
our view, an important way of characterising refinement, as it answers the
question "what sort of refinement should I be using?"
We use testing in this paper as the basis for our refinement. We develop
tests for probabilistic systems by analogy with the tests developed for
non-probabilistic systems. We make sure that our probabilistic tests, when
performed on non-probabilistic automata, give us refinement relations that
agree with those for non-probabilistic automata. We formalise this property as
a vertical refinement.
Comment: In Proceedings Refine 2011, arXiv:1106.348
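The testing-based definition of refinement can be sketched with a deliberately crude instantiation in which a "test" is just a finite trace and "passing" means being able to exhibit it; the two systems below are hypothetical and non-probabilistic:

```python
def traces(transitions, state, depth):
    """All action sequences of length <= depth the system can exhibit
    from `state`; transitions maps (state, action) -> next state."""
    if depth == 0:
        return {()}
    result = {()}
    for (s, a), t in transitions.items():
        if s == state:
            result |= {(a,) + rest for rest in traces(transitions, t, depth - 1)}
    return result

def refines(first, second, start, depth=3):
    """`first` refines `second` if `first` passes every test (here:
    exhibits every trace) that `second` passes, and maybe more."""
    return traces(second, start, depth) <= traces(first, start, depth)
```

With spec = {("s", "a"): "s"} and impl = {("s", "a"): "s", ("s", "b"): "s"}, impl refines spec but not conversely; the probabilistic tests of the paper generalise this set-inclusion check to distributions over outcomes.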
Finding Optimal Flows Efficiently
Among the models of quantum computation, the One-way Quantum Computer is one
of the most promising proposals of physical realization, and opens new
perspectives for parallelization by taking advantage of quantum entanglement.
Since a one-way quantum computation is based on quantum measurement, which is a
fundamentally nondeterministic evolution, a sufficient condition of global
determinism has been introduced as the existence of a causal flow in a graph
that underlies the computation. An O(n^3) algorithm has been introduced for
finding such a causal flow when the numbers of output and input vertices in the
graph are equal; otherwise, no polynomial-time algorithm was known for deciding
whether a graph has a causal flow or not. Our main contribution is an
O(n^2) algorithm for finding a causal flow, if any, whatever the numbers of
input and output vertices are. This answers the open question stated by Danos
and Kashefi and by de Beaudrap. Moreover, we prove that our algorithm produces
an optimal flow (a flow of minimal depth).
Whereas the existence of a causal flow is a sufficient condition for
determinism, it is not a necessary condition. A weaker version of the causal
flow, called gflow (generalized flow) has been introduced and has been proved
to be a necessary and sufficient condition for a family of deterministic
computations. Moreover, the depth of the quantum computation is upper bounded
by the depth of the gflow. However, the existence of a polynomial-time
algorithm that finds a gflow had been stated as an open question. In this paper
we answer it positively, with a polynomial-time algorithm that outputs an
optimal gflow of a given graph and thus finds an optimal correction strategy
for the nondeterministic evolution due to measurements.
Comment: 10 pages, 3 figures
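While finding a causal flow is the hard part, verifying a candidate one is straightforward. The sketch below checks the three flow conditions for a successor map f on the non-output vertices, with the partial order represented by a hypothetical layer function (vertex names and the example graph are invented):

```python
def is_causal_flow(edges, f, layer):
    """Check the causal-flow conditions for f, taking i <= j to mean
    layer[i] <= layer[j]: (1) i adjacent to f(i), (2) i strictly before
    f(i), (3) i before every other neighbour of f(i)."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    for i, fi in f.items():
        if fi not in adj.get(i, set()):
            return False                  # (1) fails
        if not layer[i] < layer[fi]:
            return False                  # (2) fails
        for k in adj[fi] - {i}:
            if not layer[i] <= layer[k]:
                return False              # (3) fails
    return True
```

On the path graph 1-2-3 with output vertex 3, the map f = {1: 2, 2: 3} with layers {1: 0, 2: 1, 3: 2} passes all three conditions, while reversing the layer order violates condition (2).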
Learning Tree Distributions by Hidden Markov Models
Hidden tree Markov models allow learning distributions for tree-structured
data while being interpretable as nondeterministic automata. We provide a
concise summary of the main approaches in the literature, focusing in
particular on the causality assumptions introduced by the choice of a specific
tree visit direction. We then sketch a novel non-parametric generalization of
the bottom-up hidden tree Markov model, with its interpretation as a
nondeterministic tree automaton with infinitely many states.
Comment: Accepted at the LearnAut2018 workshop
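The role of the visit direction can be seen in a toy likelihood computation for a top-down hidden tree Markov model (top-down is chosen here only because its inside recursion is simple; the parameters and the two-leaf tree are invented):

```python
S = 2                                     # number of hidden states
prior = [0.6, 0.4]                        # root-state distribution
A = [[0.7, 0.3], [0.2, 0.8]]              # A[parent][child]: transition
B = [[0.9, 0.1], [0.3, 0.7]]              # B[state][symbol]: emission

def inside(label, children):
    """Upward (inside) pass: per-state probability of the observations
    in a subtree, given the state of its root node."""
    beta = [B[s][label] for s in range(S)]
    for child in children:                # children are inside vectors
        for s in range(S):
            beta[s] *= sum(A[s][t] * child[t] for t in range(S))
    return beta

# Tree: a root labelled 0 with two leaf children labelled 0 and 1.
leaves = [inside(0, []), inside(1, [])]
likelihood = sum(p * b for p, b in zip(prior, inside(0, leaves)))
```

A bottom-up model instead conditions a parent's state on its children's states, which changes the conditional independence (causality) assumptions and, as the abstract notes, is exactly what the choice of visit direction encodes.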
Reasoning about real-time systems with temporal interval logic constraints on multi-state automata
Models of real-time systems using a single paradigm often turn out to be inadequate, whether the paradigm is based on states, rules, event sequences, or logic. A model-based approach to reasoning about real-time systems is presented in which a temporal interval logic called TIL is employed to define constraints on a new type of high-level automata. The combination, called hierarchical multi-state (HMS) machines, can be used to formally model a real-time system, a dynamic set of requirements, the environment, heuristic knowledge about planning-related problem solving, and the computational states of the reasoning mechanism. In this framework, mathematical techniques were developed for: (1) proving the correctness of a representation; (2) planning of concurrent tasks to achieve goals; and (3) scheduling of plans to satisfy complex temporal constraints. HMS machines allow reasoning about a real-time system from a model of how truth arises instead of merely depending on what is true in a system.
Logic-Based Specification Languages for Intelligent Software Agents
The research field of Agent-Oriented Software Engineering (AOSE) aims to find
abstractions, languages, methodologies and toolkits for modeling, verifying,
validating and prototyping complex applications conceptualized as Multiagent
Systems (MASs). A very lively research sub-field studies how formal methods can
be used for AOSE. This paper presents a detailed survey of six logic-based
executable agent specification languages that have been chosen for their
potential to be integrated in our ARPEGGIO project, an open framework for
specifying and prototyping a MAS. The six languages are ConGoLog, Agent-0, the
IMPACT agent programming language, DyLog, Concurrent METATEM and Ehhf. For each
executable language, the logic foundations are described and an example of use
is shown. A comparison of the six languages and a survey of similar approaches
complete the paper, together with considerations of the advantages of using
logic-based languages in MAS modeling and prototyping.
Comment: 67 pages, 1 table, 1 figure. Accepted for publication by the Journal
"Theory and Practice of Logic Programming", volume 4, Maurice Bruynooghe
Editor-in-Chief