
    Stochastic order on metric spaces and the ordered Kantorovich monad

    In earlier work, we introduced the Kantorovich probability monad on complete metric spaces, extending a construction due to van Breugel. Here we extend the Kantorovich monad further to a certain class of ordered metric spaces, by endowing the spaces of probability measures with the usual stochastic order. The resulting monad can be considered a metric analogue of the probabilistic powerdomain. The spaces we consider, which we call L-ordered, are spaces where the order satisfies a mild compatibility condition with the metric itself, rather than merely with the underlying topology. As we show, this is related to the theory of Lawvere metric spaces, in which the partial order structure is induced by the zero distances. We show that the algebras of the ordered Kantorovich monad are the closed convex subsets of Banach spaces equipped with a closed positive cone, with algebra morphisms given by the short and monotone affine maps. Considering the category of L-ordered metric spaces as a locally posetal 2-category, the lax and oplax algebra morphisms are exactly the concave and convex short maps, respectively. In the unordered case, we identified the Wasserstein space as the colimit of the spaces of empirical distributions of finite sequences. We prove that this extends to the ordered setting as well, by showing that the stochastic order arises by completing the order between the finite sequences, generalizing a recent result of Lawson. The proof holds on any metric space equipped with a closed partial order.
    Comment: 49 pages. Removed an incorrect statement (Theorem 6.1.10 of the previous version).
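
    For orientation, the two structures at play can be written down explicitly; the notation below is the standard one from optimal transport and stochastic orders, not quoted from the paper. On a complete metric space $(X, d)$, the Kantorovich distance admits the dual form
        $d_K(p, q) = \sup \big\{\, \big| \textstyle\int_X f \, dp - \int_X f \, dq \big| \;:\; f \colon X \to \mathbb{R} \text{ short (1-Lipschitz)} \,\big\}$,
    and, when $X$ also carries a partial order, the usual stochastic order on probability measures is
        $p \leq q \iff \textstyle\int_X f \, dp \leq \int_X f \, dq$ for every bounded, monotone, continuous $f \colon X \to \mathbb{R}$.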

    Monads, partial evaluations, and rewriting

    Monads can be interpreted as encoding formal expressions, or formal operations in the sense of universal algebra. We give a construction which formalizes the idea of "evaluating an expression partially": for example, "2+3" can be obtained as a partial evaluation of "2+2+1". This construction can be given for any monad, and it is linked to the bar construction, for which it provides an operational interpretation: the bar construction induces a simplicial set, and its 1-cells are partial evaluations. We study the properties of partial evaluations for general monads. We prove that whenever the monad is weakly cartesian, partial evaluations can be composed via the usual Kan filler property of simplicial sets, which we interpret in terms of substitution of terms. In terms of rewriting, partial evaluations give an abstract reduction system which is reflexive, confluent, and transitive whenever the monad is weakly cartesian. For the case of probability monads, partial evaluations correspond to what probabilists call conditional expectation of random variables. This manuscript is part of a work in progress on a general rewriting interpretation of the bar construction.
    Comment: Originally written for the ACT Adjoint School 2019. To appear in the Proceedings of MFPS 2020.
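
    The "2+3 from 2+2+1" example can be spelled out explicitly; the symbols below are the standard monad notation, with the concrete choice of algebra ours. Given a monad $(T, \mu, \eta)$ and a $T$-algebra $(A, e \colon TA \to A)$, a partial evaluation from $p \in TA$ to $q \in TA$ is an element $r \in TTA$ such that
        $\mu_A(r) = p$  and  $(Te)(r) = q$.
    Taking $T$ the free commutative monoid monad and $A = \mathbb{N}$ with addition, let $p = 2+2+1$ and $r = (2) + (2+1) \in TTA$: then $\mu_A(r) = 2+2+1 = p$ while $(Te)(r) = 2+3$, exhibiting $2+3$ as a partial evaluation of $2+2+1$.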

    Lifting couplings in Wasserstein spaces

    This paper makes mathematically precise the idea that conditional probabilities are analogous to path liftings in geometry. The idea of lifting is modelled in terms of the category-theoretic concept of a lens, which can be interpreted as a consistent choice of arrow liftings. The category we study is that of probability measures over a given standard Borel space, with morphisms given by couplings, or transport plans. The geometrical picture becomes even more apparent once we equip the arrows of the category with weights, which one can interpret as "lengths" or "costs", forming a so-called weighted category, a structure that unifies several concepts of category theory and metric geometry. Indeed, we show that the weighted version of a lens is tightly connected to the notion of a submetry in geometry. Every weighted category gives rise to a pseudo-quasimetric space via optimization over the arrows. In particular, Wasserstein spaces can be obtained from the weighted categories of probability measures and their couplings, with the weight of a coupling given by its cost. In this case, conditionals allow one to form weighted lenses, which one can interpret as "lifting transport plans while preserving their cost."
    Comment: 27 pages.
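
    To make the last point concrete, in standard optimal-transport notation (ours, not quoted from the paper): writing $\Gamma(p, q)$ for the set of couplings of probability measures $p$ and $q$ on a metric space $(X, d)$, and taking the weight of a coupling $r$ to be its cost $\int d \, dr$, the optimization over arrows yields
        $W(p, q) = \inf_{r \in \Gamma(p, q)} \int_{X \times X} d(x, y) \, dr(x, y)$,
    which is how the Wasserstein distance arises from the weighted category of measures and couplings.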

    Markov Categories and Entropy

    Markov categories are a novel framework to describe and treat problems in probability and information theory. In this work we combine the categorical formalism with the traditional quantitative notions of entropy, mutual information, and data processing inequalities. We show that several quantitative aspects of information theory can be captured by an enriched version of Markov categories, where the spaces of morphisms are equipped with a divergence or even a metric. As is customary in information theory, mutual information can be defined as a measure of how far a joint source is from displaying independence of its components. More strikingly, Markov categories give a notion of determinism for sources and channels, and we can define entropy exactly by measuring how far a source or channel is from being deterministic. This recovers the Shannon and Rényi entropies, as well as the Gini-Simpson index used in ecology to quantify diversity, and it can be used to give a conceptual definition of generalized entropy.
    Comment: 54 pages.
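
    For reference, the quantities named here are the familiar ones; the formulas below are the standard definitions, while the "distance from determinism" reading is the paper's contribution. For a probability distribution $p$ on a finite set $X$:
        Shannon entropy:    $H(p) = - \sum_{x \in X} p(x) \log p(x)$
        Rényi entropy:      $H_\alpha(p) = \frac{1}{1 - \alpha} \log \sum_{x \in X} p(x)^\alpha$, for $\alpha > 0$, $\alpha \neq 1$
        Gini-Simpson index: $1 - \sum_{x \in X} p(x)^2$
    Each of these vanishes exactly when $p$ is a point mass, which is the sense in which they measure how far a source is from being deterministic.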