128 research outputs found
Applications of Metric Coinduction
Metric coinduction is a form of coinduction that can be used to establish
properties of objects constructed as a limit of finite approximations. One can
prove a coinduction step showing that some property is preserved by one step of
the approximation process, then automatically infer by the coinduction
principle that the property holds of the limit object. This can often be used
to avoid complicated analytic arguments involving limits and convergence,
replacing them with simpler algebraic arguments. This paper examines the
application of this principle in a variety of areas, including infinite
streams, Markov chains, Markov decision processes, and non-well-founded sets.
These results point to the usefulness of coinduction as a general proof
technique.
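The principle can be made concrete with a small computation. The sketch below is not from the paper: the two-state Markov chain and its numbers are invented for illustration. One step of the approximation process, mu ↦ mu·P, is a contraction in total variation and preserves the closed property "mu is a probability distribution"; by metric coinduction the limit object, the stationary distribution, then satisfies the property as well, with no separate limit argument.

```python
# Metric coinduction on a hypothetical 2-state Markov chain: the step
# mu -> mu P is a contraction and preserves the closed property
# "mu is a probability distribution", so the limit (the stationary
# distribution) satisfies it too.

P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(mu):
    """One step of the approximation process: mu |-> mu P."""
    return [sum(mu[i] * P[i][j] for i in range(2)) for j in range(2)]

def is_distribution(mu):
    """The closed property preserved by each step."""
    return all(x >= -1e-12 for x in mu) and abs(sum(mu) - 1.0) < 1e-9

mu = [1.0, 0.0]          # initial finite approximation
for _ in range(200):     # iterate toward the limit
    assert is_distribution(mu)   # property holds at every step
    mu = step(mu)

# Solving pi = pi P by hand gives the stationary distribution (5/6, 1/6).
print(mu)
```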
Approximate reasoning for real-time probabilistic processes
We develop a pseudo-metric analogue of bisimulation for generalized
semi-Markov processes. The kernel of this pseudo-metric corresponds to
bisimulation; thus we have extended bisimulation for continuous-time
probabilistic processes to a much broader class of distributions than
exponential distributions. This pseudo-metric gives a useful handle on
approximate reasoning in the presence of numerical information -- such as
probabilities and time -- in the model. We give a fixed point characterization
of the pseudo-metric. This makes available coinductive reasoning principles for
reasoning about distances. We demonstrate that our approach is insensitive to
potentially ad hoc articulations of distance by showing that it is intrinsic to
an underlying uniformity. We provide a logical characterization of this
uniformity using a real-valued modal logic. We show that several quantitative
properties of interest are continuous with respect to the pseudo-metric. Thus,
if two processes are metrically close, then observable quantitative properties
of interest are indeed close. Comment: a preliminary version appeared in QEST.
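To illustrate the fixed-point characterization in the simplest possible setting, the sketch below strips away the probabilistic and real-time aspects of the paper and computes a behavioural pseudo-metric on an invented deterministic system: each state emits a number and has a unique successor, and the distance functional is iterated from the zero pseudo-metric to its fixed point. The states, outputs, and discount factor are all hypothetical; the point is only the coinductive fixed-point computation, whose kernel (distance 0) identifies behaviourally equivalent states.

```python
# Fixed-point computation of a toy behavioural pseudo-metric:
#   d(s, t) = max(|out(s) - out(t)|, GAMMA * d(next(s), next(t)))
# Iterating this functional from the zero pseudo-metric converges to its
# least fixed point; states at distance 0 are exactly the bisimilar ones.

GAMMA = 0.5
out = {"a": 1.0, "b": 1.0, "c": 0.0}          # hypothetical outputs
nxt = {"a": "a", "b": "b", "c": "a"}          # hypothetical successors
states = list(out)

def iterate(d):
    return {(s, t): max(abs(out[s] - out[t]),
                        GAMMA * d[(nxt[s], nxt[t])])
            for s in states for t in states}

d = {(s, t): 0.0 for s in states for t in states}
for _ in range(60):
    d = iterate(d)

# a and b emit 1.0 forever, so they are bisimilar: their distance is 0,
# while c, which emits 0.0 first, sits at distance 1 from both.
print(d[("a", "b")], d[("a", "c")])
```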
Almost Sure Productivity
We define Almost Sure Productivity (ASP), a probabilistic generalization of
the productivity condition for coinductively defined structures. Intuitively, a
probabilistic coinductive stream or tree is ASP if it produces infinitely many
outputs with probability 1. Formally, we define almost sure productivity using
a final coalgebra semantics of programs inspired by Kerstan and König.
Then, we introduce a core language for probabilistic streams and trees, and
provide two approaches to verify ASP: a sufficient syntactic criterion, and a
reduction to model-checking pCTL* formulas on probabilistic pushdown automata.
The reduction shows that ASP is decidable for our core language.
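A minimal sketch of the intuition behind ASP, using an invented stream program rather than the paper's core language: at each step the program emits the next output with probability p and takes a silent step otherwise. For any p > 0 the number of silent steps before each output is geometric, hence finite with probability 1, so the stream produces infinitely many outputs almost surely and is ASP (for p = 0 it is not).

```python
# A toy probabilistic stream: emit with probability p, stay silent otherwise.
# For p > 0 each output is reached after a geometrically distributed (hence
# almost surely finite) number of silent steps, so the stream is ASP.

import random

def prob_stream(p, rng):
    n = 0
    while True:
        if rng.random() < p:
            yield n          # a productive step
            n += 1
        # otherwise: a silent step, consuming randomness but emitting nothing

rng = random.Random(0)       # seeded for reproducibility
s = prob_stream(0.25, rng)
first_ten = [next(s) for _ in range(10)]
print(first_ten)             # each next(s) returns after finitely many flips
```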
On Probabilistic Applicative Bisimulation and Call-by-Value λ-Calculi (Long Version)
Probabilistic applicative bisimulation is a recently introduced coinductive
methodology for program equivalence in a probabilistic, higher-order setting.
In this paper, the technique is applied to a typed, call-by-value,
lambda-calculus. Surprisingly, the obtained relation coincides with context
equivalence, contrary to what happens when call-by-name evaluation is
considered. Even more surprisingly, full abstraction only holds in a symmetric
setting. Comment: 30 pages.
On polymorphic sessions and functions: A tale of two (fully abstract) encodings
This work exploits the logical foundation of session types to determine what kind of type discipline for the π-calculus can exactly capture, and is captured by, λ-calculus behaviours. Leveraging the proof-theoretic content of the soundness and completeness of sequent calculus and natural deduction presentations of linear logic, we develop the first mutually inverse and fully abstract processes-as-functions and functions-as-processes encodings between a polymorphic session π-calculus and a linear formulation of System F. We are then able to derive results of the session calculus from the theory of the λ-calculus: (1) we obtain a characterisation of inductive and coinductive session types via their algebraic representations in System F; and (2) we extend our results to account for value and process passing, entailing strong normalisation.
The Power of Convex Algebras
Probabilistic automata (PA) combine probability and nondeterminism. They can
be given different semantics, like strong bisimilarity, convex bisimilarity, or
(more recently) distribution bisimilarity. The latter is based on the view of
PA as transformers of probability distributions, also called belief states, and
promotes distributions to first-class citizens.
We give a coalgebraic account of the latter semantics, and explain the
genesis of the belief-state transformer from a PA. To do so, we make explicit
the convex algebraic structure present in PA and identify belief-state
transformers as transition systems with state space that carries a convex
algebra. As a consequence of our abstract approach, we can give a sound proof
technique which we call bisimulation up-to convex hull. Comment: full
(extended) version of a CONCUR 2017 paper, to be submitted to LMCS.
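The passage from a probabilistic automaton to its belief-state transformer can be sketched concretely. The automaton below is hypothetical and purely probabilistic (nondeterminism is elided for brevity): a belief state is a distribution over states, an action pushes that distribution forward through the corresponding transition kernel, and convex combination of beliefs is the convex-algebra operation that the up-to convex hull technique exploits.

```python
# Belief-state transformer induced by a (purely probabilistic) automaton:
# beliefs are distributions over states, actions push them forward, and
# beliefs form a convex algebra under convex combination.

STATES = ["s0", "s1"]
TRANS = {                                  # hypothetical transition kernels
    "a": {"s0": {"s0": 0.5, "s1": 0.5}, "s1": {"s1": 1.0}},
    "b": {"s0": {"s0": 1.0},            "s1": {"s0": 1.0}},
}

def push(belief, action):
    """Transform a belief state (dict state -> prob) by one action."""
    out = {s: 0.0 for s in STATES}
    for s, p in belief.items():
        for t, q in TRANS[action][s].items():
            out[t] += p * q
    return out

def convex(lam, b1, b2):
    """Convex combination of beliefs -- the convex-algebra operation."""
    return {s: lam * b1[s] + (1 - lam) * b2[s] for s in STATES}

b = {"s0": 1.0, "s1": 0.0}    # initial belief: surely in s0
b = push(b, "a")              # now {'s0': 0.5, 's1': 0.5}
print(b)
print(convex(0.5, b, push(b, "b")))
```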
FICS 2010
Informal proceedings of the 7th workshop on Fixed Points in Computer Science (FICS 2010), held in Brno, 21-22 August 2010.
Fixed-Points for Quantitative Equational Logics
We develop a fixed-point extension of quantitative equational logic and give semantics in one-bounded complete quantitative algebras. Unlike previous related work on fixed points in metric spaces, we work with the notion of approximate equality rather than exact equality. The result is a novel theory of fixed points which not only provides solutions to the traditional fixed-point equations but also defines the rate of convergence to the fixed point. We show that such a theory is the quantitative analogue of a Conway theory and of an iteration theory, and that it reflects the metric coinduction principle. We study the Bellman equation for a Markov decision process as an illustrative example.
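The Bellman example can be sketched with standard value iteration on a tiny hypothetical MDP (the states, rewards, and discount are invented for illustration; this is not the paper's quantitative-algebra machinery). The Bellman operator is a gamma-contraction in the sup metric, so Banach's theorem gives both the unique fixed point v* and an explicit convergence rate, ||v_n − v*|| ≤ gamma^n · ||v_0 − v*||, which is exactly the kind of quantitative information a rate-aware fixed-point theory tracks.

```python
# Value iteration for a tiny hypothetical MDP:
#   (T v)(s) = max_a [ r(s, a) + GAMMA * sum_t P(s, a, t) * v(t) ]
# T is a GAMMA-contraction in the sup metric, so iteration converges to the
# unique fixed point v* at the explicit geometric rate GAMMA^n.

GAMMA = 0.9
# Two states; each action maps to (reward, successor distribution).
MDP = {
    0: {"stay": (0.0, {0: 1.0}), "go": (1.0, {1: 1.0})},
    1: {"stay": (2.0, {1: 1.0}), "go": (0.0, {0: 1.0})},
}

def bellman(v):
    return {s: max(r + GAMMA * sum(p * v[t] for t, p in dist.items())
                   for r, dist in acts.values())
            for s, acts in MDP.items()}

v = {0: 0.0, 1: 0.0}
for _ in range(500):
    v = bellman(v)

# Optimal play keeps collecting reward 2 in state 1:
# v*(1) = 2 / (1 - 0.9) = 20, and v*(0) = 1 + 0.9 * 20 = 19.
print(v)
```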
Retracing some paths in categorical semantics: From process-propositions-as-types to categorified reals and computers
The logical parallelism of propositional connectives and type constructors
extends beyond the static realm of predicates, to the dynamic realm of
processes. Understanding the logical parallelism of process propositions and
dynamic types was one of the central problems of the semantics of computation,
albeit not always clear or explicit. It sprung into clarity through the early
work of Samson Abramsky, where the central ideas of denotational semantics and
process calculus were brought together and analyzed by categorical tools, e.g.
in the structure of interaction categories. While some logical structures borne
of dynamics of computation immediately started to emerge, others had to wait,
be it because the underlying logical principles (mainly those arising from
coinduction) were not yet sufficiently well-understood, or simply because the
research community was more interested in other semantical tasks. Looking back,
it seems that the process logic uncovered by those early semantical efforts
might still be starting to emerge and that the vast field of results that have
been obtained in the meantime might be a valley on a tip of an iceberg.
In the present paper, I try to provide a logical overview of the gamut of
interaction categories and to distinguish those that model computation from
those that capture processes in general. The main coinductive constructions
turn out to be of this latter kind, as illustrated towards the end of the paper
by a compact category of all real numbers as processes, computable and
uncomputable, with polarized bisimulations as morphisms. The addition of the
reals arises as the biproduct, real vector spaces are the enriched
bicompletions, and linear algebra arises from the enriched Kan extensions. At
the final step, I sketch a structure that characterizes the computable fragment
of categorical semantics. Comment: 63 pages, 40 figures; cut two words from
the title, tried to improve (without lengthening) Sec. 8; rewrote a proof in
the Appendix.