
    Logical Aspects of Probability and Quantum Computation

    Most of the work presented in this document can be read as a sequel to previous work of the author and collaborators, published in [DSZ16, DSZ17, ABdSZ17]. In [ABdSZ17], a mathematical description of quantum homomorphisms of graphs, and more generally of relational structures, is given in the language of category theory. In particular, we introduced the concept of a ‘quantum’ monad. In this thesis we show that the quantum monad fits nicely into the categorical framework of effectus theory, developed by Jacobs et al. [Jac15, CJWW15]. Effectus theory is an emerging field in categorical logic that aims to describe logic and probability from the point of view of classical and quantum computation. The main contribution of the first part of this document is a proof that the Kleisli category of the quantum monad on relational structures is an effectus. The second part is rather different. There, distinct facets of the equivalence relation on graphs called cospectrality are described: algebraic, combinatorial and logical relations are presented as sufficient conditions for graphs to have the same spectrum (i.e. to be ‘cospectral’). Another equivalence of graphs, called fractional isomorphism, is also related using some ‘game’ comonads from Abramsky et al. [ADW17, Sha17, AS18]. We also describe a sufficient condition for a pair of graphs to be cospectral using the quantum monad: the existence of two Kleisli morphisms (going in opposite directions) between them satisfying a certain compatibility requirement.
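
    As a concrete illustration of cospectrality (my own example, not drawn from the thesis): the star K_{1,4} and the disjoint union of a 4-cycle with an isolated vertex are the smallest pair of non-isomorphic cospectral graphs. A few lines of Python with numpy verify that both spectra are {-2, 0, 0, 0, 2}.

        import numpy as np

        # Adjacency matrix of the star K_{1,4}: vertex 0 joined to vertices 1..4.
        star = np.zeros((5, 5))
        star[0, 1:] = star[1:, 0] = 1

        # Adjacency matrix of the 4-cycle C_4 plus one isolated vertex.
        cycle_plus_point = np.zeros((5, 5))
        for i in range(4):
            cycle_plus_point[i, (i + 1) % 4] = cycle_plus_point[(i + 1) % 4, i] = 1

        # Both spectra come out as {-2, 0, 0, 0, 2}: same spectrum,
        # yet the two graphs are clearly not isomorphic.
        print(np.round(np.linalg.eigvalsh(star), 6))
        print(np.round(np.linalg.eigvalsh(cycle_plus_point), 6))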

    Categories for Me, and You?

    A non-self-contained gathering of notes on category theory, including the definition of a locally cartesian closed category, of the cartesian structure in slice categories, and of the “pseudo-cartesian structure” on Eilenberg–Moore categories. References and proofs are provided, sometimes, to my knowledge, for the first time.

    Neural Nets via Forward State Transformation and Backward Loss Transformation

    This article studies (multilayer perceptron) neural networks with an emphasis on the transformations involved --- both forward and backward --- in order to develop a semantical/logical perspective in line with standard program semantics. The common two-pass training algorithms for neural networks make this viewpoint particularly fitting. In the forward direction, a neural network acts as a state transformer. In the backward direction, it turns losses on outputs into losses on inputs, thereby acting as a (real-valued) predicate transformer. In this way, backpropagation is functorial by construction, as also shown in other recent work. We illustrate this perspective by training a simple instance of a neural network.
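
    A minimal sketch of the two directions for a single dense layer (my own illustration, reading the backward pass as the usual gradient pullback of backpropagation; the article's precise notion of loss transformation may differ in detail): forward, the layer transforms an input state; backward, it transforms a loss on the output into a loss on the input along the transposed weights.

        import numpy as np

        rng = np.random.default_rng(0)
        W = rng.normal(size=(3, 4))   # weights of one dense layer
        b = rng.normal(size=3)        # bias

        def forward(x):
            # State transformer: input state x maps to output state tanh(Wx + b).
            z = W @ x + b
            return np.tanh(z), z      # keep pre-activation z for the backward pass

        def backward(z, dL_dy):
            # Loss transformer: a loss gradient on the output is pulled back
            # to a loss gradient on the input, through tanh and then W^T.
            dL_dz = dL_dy * (1 - np.tanh(z) ** 2)
            return W.T @ dL_dz

        x = rng.normal(size=4)
        y, z = forward(x)
        dL_dx = backward(z, dL_dy=2 * y)   # e.g. the gradient of the loss ||y||^2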

    Categorical Aspects of Parameter Learning

    Parameter learning is the technique for obtaining the probabilistic parameters in the conditional probability tables of a Bayesian network from tables with (observed) data --- where it is assumed that the underlying graphical structure is known. There are basically two ways of doing so, referred to as maximal likelihood estimation (MLE) and as Bayesian learning. This paper provides a categorical analysis of these two techniques and describes them in terms of basic properties of the multiset monad M, the distribution monad D and the Giry monad G. In essence, learning is about the relationships between multisets (used for counting) on the one hand and probability distributions on the other. These relationships will be described as suitable natural transformations.
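
    The MLE half can be made concrete in a few lines (a sketch of my own, for a single table with finite outcomes): maximal likelihood estimation is the normalisation map sending a multiset of observed counts in M(X) to a probability distribution in D(X), and these maps form the components of a natural transformation from M to D.

        from collections import Counter

        def mle(counts: Counter) -> dict:
            # Normalisation M(X) -> D(X): divide each count by the total,
            # turning a multiset of observations into a distribution.
            total = sum(counts.values())
            return {x: n / total for x, n in counts.items()}

        observations = Counter(["rain", "sun", "sun", "sun"])
        print(mle(observations))   # {'rain': 0.25, 'sun': 0.75}

    Naturality here says that relabelling outcomes before normalising gives the same distribution as normalising first and then pushing forward along the relabelling.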