Categorical semantics and composition of tree transducers
In this thesis we present two new approaches to composing tree transducers and, more generally, to fusing functional programs. The first approach is based on initial algebras. We prove a new variant of the acid rain theorem for mutually recursive functions in which the build function is replaced by a concrete functor. Moreover, we give a symmetric form (i.e. consumer and producer have the same syntactic form) of our new acid rain theorem in which fusion is composition in a category and thus, in particular, associative. Applying this to compose top-down tree transducers yields the same result (on the syntactic level) as the classical top-down tree transducer composition. The second approach is based on free monads and monad transformers. In the same way as monoids are used in the theory of string automata, we use monads in the theory of tree transducers. We generalize the notion of a tree transducer by defining the monadic transducer, and we prove a corresponding fusion theorem. Moreover, we prove that homomorphic monadic transducers are semantically equivalent. The latter makes it possible to compose syntactic classes of tree transducers (or particular functional programs) by simply composing endofunctors.
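The classical shortcut-fusion setting that the first approach varies can be sketched in Haskell with the well-known foldr/build form of the acid rain theorem. This is a standard illustration of the fusion law `foldr k z (build g) = g k z`, not the thesis's own construction; the names `upto` and `sumUpto` are hypothetical examples:

```haskell
{-# LANGUAGE RankNTypes #-}

-- A producer written with `build` abstracts over the list constructors.
build :: (forall b. (a -> b -> b) -> b -> b) -> [a]
build g = g (:) []

-- Acid rain / shortcut fusion: a fold consuming a `build` producer
-- fuses away the intermediate list:
--   foldr k z (build g) = g k z

-- Producer: the list [1..n] in build form.
upto :: Int -> [Int]
upto n = build (\cons nil ->
  let go i = if i > n then nil else cons i (go (i + 1)) in go 1)

-- Consumer: sum as a fold.
sumList :: [Int] -> Int
sumList = foldr (+) 0

-- Fused version of `sumList (upto n)`: the intermediate list is gone,
-- the constructors have been replaced by (+) and 0 directly.
sumUpto :: Int -> Int
sumUpto n =
  (\cons nil -> let go i = if i > n then nil else cons i (go (i + 1)) in go 1)
    (+) 0
```

The point of the fusion law is that `sumList (upto n)` and `sumUpto n` compute the same value, but the latter never allocates the list.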
List Objects with Algebraic Structure
We introduce and study the notion of list object with algebraic structure. The first key aspect of our development is that the notion of list object is
considered in the context of monoidal structure; the second key aspect is that we further equip list objects with algebraic structure in this setting. Within our framework, we observe that list objects give rise to free monoids and moreover show that this remains so in the presence of algebraic structure. Furthermore, we provide a basic theory explicitly describing as an inductively defined object such free monoids with suitably compatible algebraic structure in common practical situations. This theory is accompanied with the study of two technical themes that,
besides being of interest in their own right, are important for establishing
applications. These themes are: parametrised initiality, central to the universal property defining list objects; and approaches to algebraic structure, in particular in the context of monoidal theories. The latter leads naturally to a notion of nsr (or near semiring)
category of independent interest. With the theoretical development in place, we touch upon a variety of applications, considering Natural Numbers Objects in domain theory, giving a universal property for the monadic list transformer, providing free instances of algebraic extensions of the Haskell Monad type class, elucidating the algebraic character of the construction of opetopes in higher-dimensional algebra, and considering free models of second-order algebraic theories.
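In the simplest setting (sets, read naively as Haskell types), the list object is the free monoid, and its universal property is exactly the `foldMap` extension. A minimal sketch of the situation the abstract generalizes; the names `ext` and `singleton` are illustrative, not from the thesis:

```haskell
import Data.Monoid (Sum (..), getSum)

-- Universal property of the free monoid: for any monoid m and any
-- function f :: a -> m, there is a unique monoid homomorphism
-- ext f :: [a] -> m with  ext f . singleton = f.
-- In Haskell, this unique extension is foldMap.
ext :: Monoid m => (a -> m) -> [a] -> m
ext = foldMap

singleton :: a -> [a]
singleton x = [x]

-- ext f is a monoid homomorphism:
--   ext f []         = mempty
--   ext f (xs ++ ys) = ext f xs <> ext f ys
```

Equipping the list object with further algebraic structure, as the paper does, asks when this free-monoid picture survives in the presence of extra operations and equations.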
Lifting of operations in modular monadic semantics
Monads have become a fundamental tool for structuring denotational semantics and programs by abstracting a wide variety of computational features such as side-effects, input/output, exceptions, continuations and non-determinism. In this setting, the notion of a monad is equipped with operations that allow programmers to manipulate these computational effects. For example, a monad for side-effects is equipped with operations for setting and reading the state, and a monad for exceptions is equipped with operations for throwing and handling exceptions.
When several effects are involved, one can employ the incremental approach to modular monadic semantics, which uses monad transformers to build up the desired monad one effect at a time. However, a limitation of this approach is that the effect-manipulating operations need to be manually lifted to the resulting monad, and consequently, the lifted operations are non-uniform. Moreover, the number of liftings needed in a system grows as the product of the number of monad transformers and operations involved.
This dissertation proposes a theory of uniform lifting of operations that extends the incremental approach to modular monadic semantics with a principled technique for lifting operations. Moreover, the theory is generalized from monads to monoids in a monoidal category, making it possible to apply it to structures other than monads.
The extended theory is put into practice with the implementation of a new extensible monad transformer library in Haskell, and with the use of modular monadic semantics to obtain modular operational semantics.
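The manual-lifting problem the dissertation addresses can be illustrated with hand-rolled miniature versions of a state monad and an exception transformer. These are deliberately simplified hypothetical definitions, not the actual API of the dissertation's library or of mtl:

```haskell
-- A minimal state monad.
newtype State s a = State { runState :: s -> (a, s) }

instance Functor (State s) where
  fmap f m = State $ \s -> let (a, s') = runState m s in (f a, s')
instance Applicative (State s) where
  pure a = State $ \s -> (a, s)
  mf <*> ma = State $ \s ->
    let (f, s')  = runState mf s
        (a, s'') = runState ma s'
    in (f a, s'')
instance Monad (State s) where
  m >>= k = State $ \s -> let (a, s') = runState m s in runState (k a) s'

get :: State s s
get = State $ \s -> (s, s)

put :: s -> State s ()
put s = State $ \_ -> ((), s)

-- A minimal exception transformer.
newtype ExceptT e m a = ExceptT { runExceptT :: m (Either e a) }

instance Functor m => Functor (ExceptT e m) where
  fmap f = ExceptT . fmap (fmap f) . runExceptT
instance Monad m => Applicative (ExceptT e m) where
  pure = ExceptT . pure . Right
  mf <*> ma = ExceptT $ do
    ef <- runExceptT mf
    case ef of
      Left e  -> pure (Left e)
      Right f -> fmap (fmap f) (runExceptT ma)
instance Monad m => Monad (ExceptT e m) where
  m >>= k = ExceptT $ do
    ea <- runExceptT m
    case ea of
      Left e  -> pure (Left e)
      Right a -> runExceptT (k a)

-- The manual, per-transformer lifting the dissertation seeks to avoid:
-- each operation must be pushed through each transformer by hand.
liftedGet :: ExceptT e (State s) s
liftedGet = ExceptT (fmap Right get)

liftedPut :: s -> ExceptT e (State s) ()
liftedPut = ExceptT . fmap Right . put
```

With n transformers and k operations, one writes roughly n × k such liftings by hand, each with its own ad hoc shape; a theory of uniform lifting replaces them with a single principled construction.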
Denotational validation of higher-order Bayesian inference
We present a modular semantic account of Bayesian inference algorithms for
probabilistic programming languages, as used in data science and machine
learning. Sophisticated inference algorithms are often explained in terms of
composition of smaller parts. However, neither their theoretical justification
nor their implementation reflects this modularity. We show how to conceptualise
and analyse such inference algorithms as manipulating intermediate
representations of probabilistic programs using higher-order functions and
inductive types, and their denotational semantics. Semantic accounts of
continuous distributions use measurable spaces. However, our use of
higher-order functions presents a substantial technical difficulty: it is
impossible to define a measurable space structure over the collection of
measurable functions between arbitrary measurable spaces that is compatible
with standard operations on those functions, such as function application. We
overcome this difficulty using quasi-Borel spaces, a recently proposed
mathematical structure that supports both function spaces and continuous
distributions. We define a class of semantic structures for representing
probabilistic programs, and semantic validity criteria for transformations of
these representations in terms of distribution preservation. We develop a
collection of building blocks for composing representations. We use these
building blocks to validate common inference algorithms such as Sequential
Monte Carlo and Markov Chain Monte Carlo. To emphasize the connection between
the semantic manipulation and its traditional measure theoretic origins, we use
Kock's synthetic measure theory. We demonstrate its usefulness by proving a
quasi-Borel counterpart to the Metropolis-Hastings-Green theorem.
Formally justified and modular Bayesian inference for probabilistic programs
Probabilistic modelling offers a simple and coherent framework to describe the
real world in the face of uncertainty. Furthermore, by applying Bayes' rule
it is possible to use probabilistic models to make inferences about the state of
the world from partial observations. While traditionally probabilistic models
were constructed on paper, more recently the approach of probabilistic
programming enables users to write the models in executable languages resembling
computer programs and to freely mix them with deterministic code.
It has long been recognised that the semantics of programming languages is
complicated and the intuitive understanding that programmers have is often
inaccurate, resulting in difficult to understand bugs and unexpected program
behaviours. Programming languages are therefore studied in a rigorous way using
formal languages with mathematically defined semantics. Traditionally formal
semantics of probabilistic programs are defined using exact inference results,
but in practice exact Bayesian inference is not tractable and approximate
methods are used instead, posing a question of how the results of these
algorithms relate to the exact results. Correctness of such approximate methods
is usually argued somewhat less rigorously, without reference to a formal
semantics.
In this dissertation we formally develop denotational semantics for
probabilistic programs that correspond to popular sampling algorithms often used
in practice. The semantics is defined for an expressive typed lambda calculus
with higher-order functions and inductive types, extended with probabilistic
effects for sampling and conditioning, allowing continuous distributions and
unbounded likelihoods. It makes crucial use of the recently developed formalism
of quasi-Borel spaces to bring all these elements together. We provide semantics
corresponding to several variants of Markov chain Monte Carlo and Sequential
Monte Carlo methods and formally prove a notion of correctness for these
algorithms in the context of probabilistic programming.
We also show that the semantic construction can be directly mapped to an
implementation using established functional programming abstractions called
monad transformers. We develop a compact Haskell library for probabilistic
programming closely corresponding to the semantic construction, giving users a
high level of assurance in the correctness of the implementation. We also
demonstrate on a collection of benchmarks that the library offers performance
competitive with existing systems of similar scope.
An important property of our construction, both the semantics and the
implementation, is the high degree of modularity it offers. All the inference
algorithms are constructed by combining small building blocks in a setup where
the type system ensures correctness of compositions. We show that with basic
building blocks corresponding to vanilla Metropolis-Hastings and Sequential
Monte Carlo we can implement more advanced algorithms known in the literature,
such as Resample-Move Sequential Monte Carlo, Particle Marginal
Metropolis-Hastings, and Sequential Monte Carlo squared. These implementations
are very concise, reducing the effort required to produce them and the scope for
bugs. On top of that, our modular construction enables in some cases
deterministic testing of randomised inference algorithms, further increasing
reliability of the implementation.
Engineering and Physical Sciences Research Council, Cambridge Trust, Cambridge-Tuebingen Programme.
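To give a flavour of the building-block style described above, here is a toy weighted-sampling monad with a `score` operation for conditioning. This is a deliberately simplified sketch; the types and names (`Weighted`, `score`, `model`) are hypothetical illustrations, not the library's actual API:

```haskell
-- A toy representation: a value together with an importance weight.
newtype Weighted a = Weighted { runWeighted :: (a, Double) }

instance Functor Weighted where
  fmap f (Weighted (a, w)) = Weighted (f a, w)
instance Applicative Weighted where
  pure a = Weighted (a, 1)
  Weighted (f, w1) <*> Weighted (a, w2) = Weighted (f a, w1 * w2)
instance Monad Weighted where
  Weighted (a, w) >>= k =
    let Weighted (b, w') = k a in Weighted (b, w * w')

-- `score` records a likelihood factor, modelling conditioning:
-- weights accumulate multiplicatively along a run of the program.
score :: Double -> Weighted ()
score w = Weighted ((), w)

-- A miniature model: a coin with bias p is observed to come up
-- heads twice, so the run carries weight p * p.
model :: Double -> Weighted Double
model p = do
  score p   -- first observed head
  score p   -- second observed head
  pure p
```

Real building blocks of this kind stack such weighting with a source of randomness via monad transformers, which is what lets the type system police how inference components compose.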
Monad Composition via Preservation of Algebras
Monads are a central object of category theory and constitute crucial tools for many areas of Computer Science, from semantics of computation to functional programming. An important aspect of monads is their correspondence with algebraic theories (their ‘presentation’). As demonstrated by the history of this field, composing monads is a challenging task: the literature contains numerous mistakes and features no general method. One categorical construct, named a ‘distributive law’, allows this composition, but its existence is not guaranteed. This thesis addresses the question of monad composition by presenting a method for the construction of distributive laws. For this purpose, we introduce a notion of preservation of algebraic features: considering an arbitrary algebra for the theory presenting a monad S, we examine whether its structure is preserved when applying another monad T. In the case of success, this allows us to construct a distributive law and to compose our monads into TS. In order to develop a general framework, we focus on the class of monoidal monads. If T is monoidal, the algebraic operations presenting S are preserved in a canonical fashion; it remains to examine whether the equations presenting S are also preserved by T. As it turns out, the preservation of an equation depends on the layout of its variables: if each variable appears once on each side, the considered equation is automatically preserved by a monoidal monad. On the other hand, if a variable is duplicated or only appears on one side, preservation is not systematic. The main results of this thesis connect the preservation of such equations with structural properties of monads. In the case where T does not preserve an equation presenting S, our distributive law cannot be built; we provide a series of methods to slightly modify our monads and overcome this issue, and we investigate some less conventional distributive laws.
Finally, we consider the presentations of both S and T and revisit our construction of distributive laws, this time from an algebraic point of view. Overall, this thesis presents a general approach to the problem of monad composition by relating categorical properties of monads with preservation of algebras.
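A classical concrete instance of such a distributive law can be sketched in Haskell: the exception monad Maybe distributes over an arbitrary monad m, and the law is what makes the composite m (Maybe a) (the familiar MaybeT shape) a monad again. This standard example is offered as background, not as the thesis's construction; `distr` and `joinTS` are illustrative names:

```haskell
-- Distributive law of the exception monad Maybe over any monad m:
--   distr :: Maybe . m  ~>  m . Maybe
distr :: Monad m => Maybe (m a) -> m (Maybe a)
distr Nothing   = pure Nothing
distr (Just ma) = fmap Just ma

-- The composite monad's join, built from the distributive law:
-- collapse m (Maybe (m (Maybe a))) to m (Maybe a).
joinTS :: Monad m => m (Maybe (m (Maybe a))) -> m (Maybe a)
joinTS mmma = do
  mma <- mmma        -- run the outer m layer
  ma  <- distr mma   -- push the Maybe layer past the inner m
  pure (ma >>= id)   -- collapse the two Maybe layers
```

The thesis's question is precisely when such a `distr` exists for a given pair of monads, and its method answers this by checking whether T preserves the algebras presenting S.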
Foundations of Software Science and Computation Structures
This open access book constitutes the proceedings of the 22nd International Conference on Foundations of Software Science and Computation Structures, FOSSACS 2019, which took place in Prague, Czech Republic, in April 2019, held as part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2019. The 29 papers presented in this volume were carefully reviewed and selected from 85 submissions. They deal with foundational research with a clear significance for software science.