LIPIcs, Volume 261, ICALP 2023, Complete Volume
Measure-theoretic semantics for quantitative parity automata
Quantitative parity automata (QPAs) generalise non-deterministic parity automata (NPAs) by adding weights from a certain semiring to transitions. QPAs run on infinite word/tree-like structures, modelled as coalgebras of a polynomial functor F. They can also arise as certain products between a quantitative model (with branching modelled via the same semiring of quantities, and linear behaviour described by the functor F) and an NPA (modelling a qualitative property of F-coalgebras). We build on recent work on semiring-valued measures to define a way to measure the set of paths through a quantitative branching model which satisfy a qualitative property (captured by an unambiguous NPA running on F-coalgebras). Our main result shows that the notion of extent of a QPA (which generalises non-emptiness of an NPA, and is defined as the solution of a nested system of equations) provides an equivalent characterisation of the measure of the accepting paths through the QPA. This result makes recently-developed methods for computing nested fixpoints available for model checking qualitative, linear-time properties against quantitative branching models
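The nested-fixpoint characterisation of extents can be illustrated in miniature. The sketch below is hypothetical (the toy automaton and all names are invented, not taken from the paper): it instantiates the Boolean semiring, where the extent of a QPA with priorities {1, 2} collapses to the classical Büchi formula νX. μY. pre((F ∩ X) ∪ Y), solved by Kleene iteration.

```python
# Kleene iteration for least/greatest fixpoints of a monotone map
# on a finite lattice (here: subsets of a finite state space).

def lfp(f, bottom):
    """Least fixpoint: iterate f from the bottom element."""
    x = bottom
    while f(x) != x:
        x = f(x)
    return x

def gfp(f, top):
    """Greatest fixpoint: iterate f from the top element."""
    x = top
    while f(x) != x:
        x = f(x)
    return x

# Hypothetical toy automaton: state 3 is a sink that never accepts.
states = {0, 1, 2, 3}
trans = {0: {1}, 1: {2}, 2: {1}, 3: {3}}
accepting = {1}

def pre(S):
    """States with at least one successor in S."""
    return {q for q in states if trans[q] & S}

# Boolean-semiring extent for priorities {1, 2}:
# nu X. mu Y. pre((accepting & X) | Y)
# i.e. the states that visit `accepting` infinitely often.
extent = gfp(lambda X: lfp(lambda Y: pre((accepting & X) | Y), set()), states)
print(extent)  # {0, 1, 2}
```

Over a genuine quantitative semiring the same nested iteration runs on semiring-valued vectors rather than state sets.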
Categorical Modelling of Logic Programming: Coalgebra, Functorial Semantics, String Diagrams
Logic programming (LP) is driven by the idea that logic subsumes computation. Over the
past 50 years, along with the emergence of numerous logic systems, LP has also grown into a
large family, the members of which are designed to deal with various computation scenarios.
Among them, we focus on two of the most influential quantitative variants: probabilistic
logic programming (PLP) and weighted logic programming (WLP).
In this thesis, we investigate a uniform understanding of logic programming and its
quantitative variants from the perspective of category theory. In particular, we explore both a
coalgebraic and an algebraic understanding of LP, PLP and WLP.
On the coalgebraic side, we propose a goal-directed strategy for calculating the probabilities
and weights of atoms in PLP and WLP programs, respectively. We then develop a coalgebraic
semantics for PLP and WLP, built on existing coalgebraic semantics for LP. By choosing
the appropriate functors representing probabilistic and weighted computation, such coalgebraic
semantics characterise exactly the goal-directed behaviour of PLP and WLP programs.
On the algebraic side, we define a functorial semantics of LP, PLP, and WLP, such that all
three share the same syntactic category of string diagrams, and differ in their semantic
categories according to their data/computation type. This allows for a uniform diagrammatic
expression for certain semantic constructs. Moreover, based on similar approaches to Bayesian
networks, this provides a framework to formalise the connection between PLP and Bayesian
networks. Furthermore, we prove a sound and complete axiomatisation of the semantic category
for LP, in terms of string diagrams. Together with the diagrammatic presentation of the
fixed point semantics, one obtains a decidable calculus for proving the equivalence of
propositional definite logic programs
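As a toy illustration of the goal-directed strategy described above (the program and its encoding are invented for this sketch, not taken from the thesis), the weight of an atom in an acyclic weighted logic program can be computed by a semiring sum over its clauses and a semiring product over each clause body:

```python
import math

# Hypothetical acyclic weighted program over the (+, *) semiring:
# each atom maps to a list of (clause weight, body atoms).
program = {
    "a": [(0.4, ["b", "c"]), (0.2, [])],  # a :- b, c.  plus a unit clause for a
    "b": [(0.5, [])],
    "c": [(1.0, ["b"])],
}

def weight(atom):
    """Goal-directed evaluation: sum over clauses, product over body atoms."""
    return sum(w * math.prod(weight(g) for g in body)
               for w, body in program[atom])

print(weight("a"))  # 0.4 * 0.5 * 0.5 + 0.2 = 0.3 (up to floating point)
```

Swapping (+, *) for another semiring (e.g. (max, +)) changes the computation from probabilities to weights without touching the evaluation strategy.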
Quantum Bisimilarity via Barbs and Contexts: Curbing the Power of Non-Deterministic Observers
Past years have seen the development of a few proposals for quantum
extensions of process calculi. The rationale is clear: with the development of
quantum communication protocols, there is a need to abstract and focus on the
basic features of quantum concurrent systems, like CCS has done for its
classical counterpart. So far, though, no accepted standard has emerged,
neither for the syntax nor for the behavioural semantics. Indeed, the various
proposals do not agree on what should be the observational properties of
quantum values, and as a matter of fact, the soundness of such properties has
never been validated against the prescriptions of quantum theory.
To this aim, we introduce a new calculus, Linear Quantum CCS, and investigate
the features of behavioural equivalences based on barbs and contexts. Our
calculus can be thought of as an asynchronous, linear version of qCCS (based on
value-passing CCS). Linearity ensures that each qubit is sent exactly once,
precisely specifying which qubits of a process interact with the context.
We exploit contexts to examine how bisimilarities relate to quantum theory.
We show that the observational power of general contexts is incompatible with
quantum theory: roughly, they can perform non-deterministic moves depending on
quantum values without measuring (hence perturbing) them.
Therefore, we refine the operational semantics in order to prevent contexts
from performing unfeasible non-deterministic choices. This induces a coarser
bisimilarity that better fits the quantum setting: (i) it lifts the
indistinguishability of quantum states to the distributions of processes and,
despite the additional constraints, (ii) it preserves the expressivity of
non-deterministic choices based on classical information. To the best of our
knowledge, our semantics is the first one that satisfies the two properties
above.

Comment: This is the extended version of the POPL2024 paper "Quantum
Bisimilarity via Barbs and Contexts: Curbing the Power of Non-Deterministic
Observers"
Probabilistic Guarded KAT Modulo Bisimilarity: Completeness and Complexity
We introduce Probabilistic Guarded Kleene Algebra with Tests (ProbGKAT), an extension of GKAT that allows reasoning about uninterpreted imperative programs with probabilistic branching. We give its operational semantics in terms of a special class of probabilistic automata. We give a sound and complete Salomaa-style axiomatisation of bisimilarity of ProbGKAT expressions. Finally, we show that bisimilarity of ProbGKAT expressions can be decided in O(n³ log n) time via a generic partition refinement algorithm
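The partition-refinement idea underlying such decision procedures can be sketched naively as follows (the labelled transition system below is made up for illustration, and the O(n³ log n) bound requires a cleverer implementation than this quadratic-per-round loop):

```python
def bisimilarity(states, trans):
    """Naive partition refinement for strong bisimilarity on a finite LTS.
    trans maps each state to a set of (label, successor) pairs."""
    part = {s: 0 for s in states}
    while True:
        # Signature of a state: which blocks it reaches, under which labels.
        sig = {s: frozenset((a, part[t]) for a, t in trans[s]) for s in states}
        blocks, new = {}, {}
        for s in states:
            new[s] = blocks.setdefault((part[s], sig[s]), len(blocks))
        if new == part:          # stable partition = bisimilarity classes
            return part
        part = new

# Hypothetical LTS: s and t are bisimilar; all deadlocked states form one class.
lts = {
    "s": {("a", "d1")}, "t": {("a", "d2")},
    "d1": set(), "d2": set(), "u": set(),
}
part = bisimilarity(list(lts), lts)
print(part["s"] == part["t"], part["s"] == part["u"])  # True False
```

For ProbGKAT the signatures would additionally record probability distributions over blocks, but the refine-until-stable skeleton is the same.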
Bisimulations for Kripke models of Fuzzy Multimodal Logics
The main objective of the dissertation is to provide a detailed study of several different types of simulations and
bisimulations for Kripke models of fuzzy multimodal logics. Two types of simulations (forward and backward)
and five types of bisimulations (forward, backward, forward-backward, backward-forward and regular) are presented
hereby. For each type of simulation and bisimulation, an algorithm is created to test the existence of the simulation
or bisimulation and, if it exists, the algorithm computes the greatest one. The dissertation presents the application of
bisimulations in the state reduction of fuzzy Kripke models, while preserving their semantic properties. Next, weak simulations and bisimulations are considered and the Hennessy-Milner property is examined. Finally, an algorithm is given to compute weak simulations and bisimulations for fuzzy Kripke models over locally finite algebras
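The flavour of these greatest-simulation algorithms can be conveyed by a small sketch: computing the greatest forward simulation between states of a fuzzy transition system over the Gödel structure ([0, 1], min, →) by downward iteration from the all-ones relation. The matrix and all names below are hypothetical, not taken from the dissertation.

```python
def res(x, y):
    """Goedel residuum x -> y."""
    return 1.0 if x <= y else y

# Hypothetical fuzzy transition matrix R[s][t] in [0, 1].
R = [[0.0, 0.8],
     [0.0, 0.5]]
n = len(R)

# Greatest fixpoint from the top element: sim[s][t] is the degree
# to which state t forward-simulates state s.
sim = [[1.0] * n for _ in range(n)]
changed = True
while changed:
    new = [[min(res(R[s][s2],
                    max(min(R[t][t2], sim[s2][t2]) for t2 in range(n)))
                for s2 in range(n))
            for t in range(n)]
           for s in range(n)]
    changed = new != sim
    sim = new

print(sim)  # [[1.0, 0.5], [1.0, 1.0]]
```

Local finiteness of the truth-value algebra is what guarantees that this descending iteration stabilises after finitely many rounds.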
Coalgebra for the working software engineer
Often referred to as ‘the mathematics of dynamical, state-based systems’, Coalgebra claims to provide a compositional and uniform framework to specify, analyse and reason about state and behaviour in computing. This paper addresses this claim by discussing why Coalgebra matters for the design of models and logics for computational phenomena. To a great extent, in this domain one is interested in properties that are preserved along the system’s evolution, the so-called ‘business rules’ or system’s invariants, as well as in liveness requirements, stating that e.g. some desirable outcome will be eventually produced. Both classes are examples of modal assertions, i.e. properties that are to be interpreted across a transition system capturing the system’s dynamics. The relevance of modal reasoning in computing is witnessed by the fact that most university syllabi in the area include some incursion into modal logic, in particular in its temporal variants. The novelty is that, as it happens with the notions of transition, behaviour, or observational equivalence, modalities in Coalgebra acquire a shape. That is, they become parametric on whatever type of behaviour, and corresponding coinduction scheme, seems appropriate for addressing the problem at hand. In this context, the paper revisits Coalgebra from a computational perspective, focussing on three topics central to software design: how systems are modelled, how models are composed, and finally, how properties of their behaviours can be expressed and verified.

Fuzziness, as a way to express imprecision, or uncertainty, in computation is an important feature in a number of current application scenarios: from hybrid systems interfacing with sensor networks with error boundaries, to knowledge bases collecting data from often non-coincident human experts. Their abstraction in e.g. fuzzy transition systems led to a number of mathematical structures to model this sort of system and reason about them.
This paper adds two more elements to this family: two modal logics, framed as institutions, to reason about fuzzy transition systems and the corresponding processes. This paves the way to the development, in the second part of the paper, of an associated theory of structured specification for fuzzy computational systems
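To make the modal dimension concrete, here is a minimal sketch (states, degrees, and valuation invented for illustration) of evaluating Gödel-style box and diamond modalities over a fuzzy transition system: □p at s is the infimum over successors of R(s, s') → p(s'), and ◇p is the supremum of min(R(s, s'), p(s')).

```python
# Hypothetical fuzzy Kripke frame: accessibility degrees and a fuzzy valuation.
R = {(0, 1): 0.9, (0, 2): 0.4}   # absent pairs have degree 0
p = {0: 0.0, 1: 1.0, 2: 0.3}
states = [0, 1, 2]

def res(x, y):
    """Goedel residuum x -> y."""
    return 1.0 if x <= y else y

def box(s):
    """[]p at s: infimum over states of R(s, t) -> p(t)."""
    return min((res(R.get((s, t), 0.0), p[t]) for t in states), default=1.0)

def diamond(s):
    """<>p at s: supremum over states of min(R(s, t), p(t))."""
    return max((min(R.get((s, t), 0.0), p[t]) for t in states), default=0.0)

print(box(0), diamond(0))  # 0.3 0.9
```

Changing the residuum swaps the underlying fuzzy logic while leaving the evaluation scheme intact, which is the kind of parametricity the institutional framing exploits.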