12 research outputs found

    Approximate Inference via Fibrations of Statistical Games

    We characterize a number of well-known systems of approximate inference as loss models: lax sections of 2-fibrations of statistical games, constructed by attaching internally defined loss functions to Bayesian lenses. Our examples include the relative entropy, which constitutes a strict section, and whose chain rule is formalized by the horizontal composition of the 2-fibration. In order to capture this compositional structure, we first introduce the notion of `copy-composition', alongside corresponding bicategories through which the composition of copy-discard categories factorizes. These bicategories are a variant of the $\mathbf{Copara}$ construction, and so we additionally introduce coparameterized Bayesian lenses, proving that coparameterized Bayesian updates compose optically, as in the non-coparameterized case.
    Comment: Accepted as a proceedings paper at ACT 202
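The chain rule of the relative entropy that this strict section formalizes can be checked numerically: D(p(x,y) || q(x,y)) = D(p(x) || q(x)) + E_x D(p(y|x) || q(y|x)). A minimal sketch in plain Python, with finite distributions as dicts; all names here are illustrative, not from the paper:

```python
import math

def kl(p, q):
    """Relative entropy D(p || q) for finite distributions given as dicts."""
    return sum(px * math.log(px / q[x]) for x, px in p.items() if px > 0)

# A joint distribution p(x, y) and a reference q(x, y) on {0,1} x {0,1}.
p = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}
q = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}

def marginal(joint):
    m = {}
    for (x, _), pr in joint.items():
        m[x] = m.get(x, 0.0) + pr
    return m

def conditional(joint, x):
    mx = marginal(joint)[x]
    return {y: pr / mx for (x2, y), pr in joint.items() if x2 == x}

p_x, q_x = marginal(p), marginal(q)

# Chain rule: D(p(x,y) || q(x,y)) = D(p(x) || q(x)) + E_x D(p(y|x) || q(y|x))
lhs = kl(p, q)
rhs = kl(p_x, q_x) + sum(
    p_x[x] * kl(conditional(p, x), conditional(q, x)) for x in p_x
)
assert abs(lhs - rhs) < 1e-12
```

The identity is exact for full-support finite distributions, so the assertion holds up to floating-point error.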

    Mathematical Foundations for a Compositional Account of the Bayesian Brain

    This dissertation reports some first steps towards a compositional account of active inference and the Bayesian brain. Specifically, we use the tools of contemporary applied category theory to supply functorial semantics for approximate inference. To do so, we define on the `syntactic' side the new notion of Bayesian lens and show that Bayesian updating composes according to the compositional lens pattern. Using Bayesian lenses, and inspired by compositional game theory, we define fibrations of statistical games and classify various problems of statistical inference as corresponding sections: the chain rule of the relative entropy is formalized as a strict section, while maximum likelihood estimation and the free energy give lax sections. In the process, we introduce a new notion of `copy-composition'. On the `semantic' side, we present a new formalization of general open dynamical systems (particularly: deterministic, stochastic, and random; and discrete- and continuous-time) as certain coalgebras of polynomial functors, which we show collect into monoidal opindexed categories (or, alternatively, into algebras for multicategories of generalized polynomial functors). We use these opindexed categories to define monoidal bicategories of cilia: dynamical systems which control lenses, and which supply the target for our functorial semantics. Accordingly, we construct functors which explain the bidirectional compositional structure of predictive coding neural circuits under the free energy principle, thereby giving a formal mathematical underpinning to the bidirectionality observed in the cortex. Along the way, we explain how to compose rate-coded neural circuits using an algebra for a multicategory of linear circuit diagrams, showing subsequently that this is subsumed by lenses and polynomial functors.
    Comment: DPhil thesis; as submitted. Main change from v1: improved treatment of statistical games. A number of errors also fixed, and some presentation improved. Comments most welcome.
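The "compositional lens pattern" invoked above is the familiar get/put pattern from functional programming: forward maps compose left to right, backward maps compose right to left through the forward values. A minimal deterministic sketch (ordinary lenses on records, not the stochastic Bayesian lenses of the thesis; all names are illustrative):

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Lens:
    get: Callable[[Any], Any]       # forward: whole -> part
    put: Callable[[Any, Any], Any]  # backward: whole x new-part -> new-whole

def compose(l2: Lens, l1: Lens) -> Lens:
    """l1 focuses first (closer to the root), then l2 focuses within it.
    The backward pass threads through the forward value of l1."""
    return Lens(
        get=lambda x: l2.get(l1.get(x)),
        put=lambda x, v: l1.put(x, l2.put(l1.get(x), v)),
    )

def field(name):
    """A lens onto one key of a dict, updating immutably."""
    return Lens(get=lambda d: d[name],
                put=lambda d, v: {**d, name: v})

person = {"name": "Ada", "address": {"street": "Elm", "city": "York"}}
lens = compose(field("street"), field("address"))

assert lens.get(person) == "Elm"
moved = lens.put(person, "Oak")
assert moved["address"]["street"] == "Oak"
assert person["address"]["street"] == "Elm"  # original untouched
```

In the thesis the forward maps are stochastic channels and the backward maps are state-dependent Bayesian inversions, but the composition rule has this same bidirectional shape.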

    Open Dynamical Systems as Coalgebras for Polynomial Functors, with Application to Predictive Processing

    We present categories of open dynamical systems with general time evolution as categories of coalgebras opindexed by polynomial interfaces, and show how this extends the coalgebraic framework to capture common scientific applications such as ordinary differential equations, open Markov processes, and random dynamical systems. We then extend Spivak's operad Org to this setting, and construct associated monoidal categories whose morphisms represent hierarchical open systems; when their interfaces are simple, these categories supply canonical comonoid structures. We exemplify these constructions using the 'Laplace doctrine', which provides dynamical semantics for active inference, and indicate some connections to Bayesian inversion and coalgebraic logic.
    Comment: In Proceedings ACT 2022, arXiv:2307.1551
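A discrete-time special case of such a coalgebraic open system can be sketched in a few lines: a state set equipped with a readout (output) map and an input-dependent update, together with a series composition that wires one system's output into another's input. This is only a toy illustration of the idea, not the paper's polynomial-functor formalism; all names are invented for the example:

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class OpenSystem:
    """A discrete-time open dynamical system, coalgebra-style:
    a state together with a readout map (S -> O) and an update (S x I -> S)."""
    state: Any
    readout: Callable[[Any], Any]
    update: Callable[[Any, Any], Any]

    def step(self, inp):
        out = self.readout(self.state)
        self.state = self.update(self.state, inp)
        return out

def series(first: OpenSystem, second: OpenSystem) -> OpenSystem:
    """Wire the first system's output into the second's input;
    the composite state is the pair of component states."""
    return OpenSystem(
        state=(first.state, second.state),
        readout=lambda s: second.readout(s[1]),
        update=lambda s, i: (first.update(s[0], i),
                             second.update(s[1], first.readout(s[0]))),
    )

# Example: an accumulator feeding a doubler.
acc = OpenSystem(0, readout=lambda s: s, update=lambda s, i: s + i)
dbl = OpenSystem(0, readout=lambda s: s, update=lambda s, i: 2 * i)
composite = series(acc, dbl)
outs = [composite.step(i) for i in [1, 2, 3]]
assert outs == [0, 0, 2]
```

The one-step delay visible in the outputs reflects that the readout is taken before the update, as in a Moore-style coalgebra.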

    Mathematical foundations for a compositional account of the Bayesian brain

    Because category theory is unfamiliar to many computational neuroscientists and cognitive scientists, we have made a particular effort to give clear, detailed, and approachable expositions of all the category-theoretic structures and results of which we make use. We hope that this dissertation will prove helpful in establishing a new 'well-typed' science of life and mind, and in facilitating interdisciplinary communication.

    The Compositional Structure of Bayesian Inference

    Bayes' rule tells us how to invert a causal process in order to update our beliefs in light of new evidence. If the process is believed to have a complex compositional structure, we may observe that the inversion of the whole can be computed piecewise in terms of the component processes. We study the structure of this compositional rule, noting that it relates to the lens pattern in functional programming. Working in a suitably general axiomatic presentation of a category of Markov kernels, we see how we can think of Bayesian inversion as a particular instance of a state-dependent morphism in a fibred category. We discuss the compositional nature of this, formulated as a functor on the underlying category, and explore how this can be used for a more type-driven approach to statistical inference.
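For finite Markov kernels the piecewise-inversion property can be checked directly: inverting a composite kernel at a prior agrees with composing the component inversions, each taken at the appropriately pushed-forward prior. A sketch using NumPy, with column-stochastic matrices as kernels; the code is illustrative and not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def channel(n_in, n_out):
    """A random Markov kernel as a column-stochastic matrix (columns sum to 1)."""
    m = rng.random((n_out, n_in))
    return m / m.sum(axis=0)

def invert(c, prior):
    """Bayesian inversion of channel c at the given prior: a kernel Y -> X."""
    joint = c * prior               # joint[y, x] = c(y|x) * prior[x]
    py = joint.sum(axis=1)          # pushforward marginal on Y
    return (joint / py[:, None]).T  # entry [x, y] = p(x | y)

c = channel(3, 4)   # X -> Y
d = channel(4, 2)   # Y -> Z
prior = np.array([0.5, 0.3, 0.2])

# Inverting the composite at the prior...
whole = invert(d @ c, prior)
# ...equals inverting d at the pushed-forward prior c @ prior,
# then inverting c at the original prior (the lens composite).
piecewise = invert(c, prior) @ invert(d, c @ prior)
assert np.allclose(whole, piecewise)
```

The equality is exact here because all kernels have full support; in general it holds almost surely, which is part of what the paper's state-dependent formulation makes precise.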

    Active Inference in String Diagrams: A Categorical Account of Predictive Processing and Free Energy

    We present a categorical formulation of the cognitive frameworks of Predictive Processing and Active Inference, expressed in terms of string diagrams interpreted in a monoidal category with copying and discarding. This includes diagrammatic accounts of generative models, Bayesian updating, perception, planning, active inference, and free energy. In particular we present a diagrammatic derivation of the formula for active inference via free energy minimisation, and establish a compositionality property for free energy, allowing free energy to be applied at all levels of an agent's generative model. Aside from aiming to provide a helpful graphical language for those familiar with active inference, we conversely hope that this article may provide a concise formulation and introduction to the framework.
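The free energy in question is the variational free energy, which for a discrete generative model is F[q] = E_q[ln q(s) - ln p(s, o)]; it upper-bounds the surprise -ln p(o), with equality exactly when q is the posterior. A toy numerical check (the model and all numbers are illustrative):

```python
import math

# A toy generative model p(s, o) over a binary state s and observation o.
p_s = {0: 0.7, 1: 0.3}                # prior over states
p_o_given_s = {0: {0: 0.9, 1: 0.1},   # likelihood p(o | s)
               1: {0: 0.2, 1: 0.8}}

def free_energy(q, o):
    """Variational free energy F[q] = E_q[ln q(s) - ln p(s, o)]."""
    return sum(q[s] * (math.log(q[s]) - math.log(p_s[s] * p_o_given_s[s][o]))
               for s in q if q[s] > 0)

o = 1
evidence = sum(p_s[s] * p_o_given_s[s][o] for s in p_s)
surprise = -math.log(evidence)

# F upper-bounds the surprise for any belief q...
q_guess = {0: 0.5, 1: 0.5}
assert free_energy(q_guess, o) >= surprise

# ...with equality at the exact posterior.
posterior = {s: p_s[s] * p_o_given_s[s][o] / evidence for s in p_s}
assert abs(free_energy(posterior, o) - surprise) < 1e-12
```

The gap F[q] - (-ln p(o)) is precisely the KL divergence from q to the posterior, which is why minimising F performs approximate Bayesian inference.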

    The Compositional Structure of Bayesian Inference

    Comment: This paper combines ideas and material from two unpublished preprints, arXiv:2006.01631 and arXiv:2209.1472