    Combinatorial Conversion and Moment Bisimulation for Stochastic Rewriting Systems

    We develop a novel method to analyze the dynamics of stochastic rewriting systems evolving over finitary adhesive, extensive categories. Our formalism is based on the so-called rule algebra framework and exhibits an intimate relationship between the combinatorics of the rewriting rules (as encoded in the rule algebra) and the dynamics which these rules generate on observables (as encoded in the stochastic mechanics formalism). We introduce the concept of combinatorial conversion, whereby under certain technical conditions the evolution equation for (the exponential generating function of) the statistical moments of observables can be expressed as the action of certain differential operators on formal power series. This permits us to formulate the novel concept of moment-bisimulation, whereby two dynamical systems are compared in terms of their evolution of sets of observables that are in bijection. In particular, we exhibit non-trivial examples of graphical rewriting systems that are moment-bisimilar to certain discrete rewriting systems (such as branching processes or the larger class of stochastic chemical reaction systems). Our results point towards applying a wide range of well-established exact and approximate analysis techniques developed for chemical reaction systems to the far richer class of general stochastic rewriting systems.
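    The bridge from rewriting systems to chemical reaction systems can be made concrete on the simplest moment-closed example. The following sketch (names and parameters illustrative, not from the paper) simulates a pure-death process N → N−1 at rate d·N with a Gillespie algorithm and compares the empirical mean against the solution of its first-moment ODE, dE[N]/dt = −d·E[N]:

```python
# Hedged sketch: check the closed first-moment equation of a pure-death
# process against stochastic simulation. E[N(t)] = N0 * exp(-d * t).
import math
import random

def gillespie_death(n0, d, t_end, rng):
    """Simulate N -> N-1 firing at total rate d*N up to time t_end."""
    t, n = 0.0, n0
    while n > 0:
        t += rng.expovariate(d * n)  # waiting time to the next event
        if t > t_end:
            break
        n -= 1
    return n

def empirical_mean(n0, d, t_end, runs, seed=0):
    """Average N(t_end) over many independent trajectories."""
    rng = random.Random(seed)
    return sum(gillespie_death(n0, d, t_end, rng) for _ in range(runs)) / runs

if __name__ == "__main__":
    n0, d, t_end = 100, 1.0, 1.0
    exact = n0 * math.exp(-d * t_end)  # solution of the moment ODE
    print(exact, empirical_mean(n0, d, t_end, runs=2000))
```

    For richer rule sets the moment hierarchy generally does not close, which is where the paper's combinatorial-conversion machinery comes in.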

    Role of the inter‐ply microstructure in the consolidation quality of high‐performance thermoplastic composites

    Consolidation of carbon fiber (CF)/high-performance thermoplastic composites is much less understood than that of their thermoset counterparts. It is usually assumed that consolidation quality is directly linked to the removal of voids within the sample during consolidation, leading to mechanical properties suitable for aerospace applications. A systematic study of the temporal evolution of the microstructure of CF/polyetherketoneketone (PEKK) samples consolidated under low pressure in a rheometer is related to the increase in interlaminar shear strength (ILSS). The results show that despite similar void contents well below 1 vol%, samples can present significant differences in ILSS values, from 80 to 95 MPa for cross-ply samples and from 98 to 112 MPa for unidirectional (UD) ones. A microstructural analysis shows that, for these materials, consolidation quality is instead related to a reorganization of the inter-ply, a resin-rich (70 vol%) region of typical thickness 10 ÎŒm which is slowly repopulated with fibers during consolidation.

    Stochastic mechanics of graph rewriting

    We propose an algebraic approach to stochastic graph-rewriting which extends the classical construction of the Heisenberg-Weyl algebra and its canonical representation on the Fock space. Rules are seen as particular elements of an algebra of 'diagrams' (the diagram algebra D). Diagrams can be thought of as formal computational traces represented in partial time. They span a vector space which carries a natural filtered Hopf algebra structure. Diagrams can be evaluated to normal diagrams (each corresponding to a rule) and generate an associative unital (non-commutative) ∗-algebra of rules (the rule algebra R). Evaluation becomes a morphism of unital associative algebras which maps general diagrams in D to normal ones in R. In this algebraic reformulation, usual distinctions between graph observables (real-valued maps on the set of graphs defined by counting subgraphs), and rules disappear. Instead, natural algebraic substructures of R arise: formal observables are seen as rules with equal left and right hand sides and form a commutative subalgebra, the ones counting subgraphs forming a sub-subalgebra of identity rules. Actual graph-rewriting (of the DPO type) is recovered as a canonical representation of the rule algebra as linear operators over the vector space generated by (isomorphism classes of) finite graphs. The construction of the representation is in close analogy with, and subsumes, the classical (multi-type bosonic) Fock space representation of the Heisenberg-Weyl algebra. This subtle shift of point of view (away from its canonical representation to the rule algebra itself) has far-reaching and unexpected consequences. We find that natural variants of the evaluation morphism give rise to concepts of graph transformations hitherto not considered (these will be described in a separate paper, as in this extended abstract we limit ourselves to the simplest concept, namely that of DPO-rewriting).
    We prove very simply a DPO version of the jump-closure theorem, namely that the subspace of representations of formal graph observables is closed under the action of any rule set. From this jump-closure result it follows that for any set of rules R, one can derive a formal and self-consistent Kolmogorov backward equation for (representations of) formal observables.
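    The Heisenberg-Weyl representation that this construction subsumes can be sketched in finite dimensions. The following illustration (mine, not the paper's) builds the creation operator a† (add a particle) and annihilation operator a (remove one) on a Fock space truncated to occupation numbers 0..n−1, and checks the canonical commutation relation [a, a†] = 1, which holds everywhere except in the truncation corner:

```python
# Hedged sketch: truncated bosonic Fock-space representation of the
# Heisenberg-Weyl algebra, using plain nested lists as matrices.
import math

def creation(n):
    """a_dag |k> = sqrt(k+1) |k+1>, truncated to dimension n."""
    m = [[0.0] * n for _ in range(n)]
    for k in range(n - 1):
        m[k + 1][k] = math.sqrt(k + 1)
    return m

def annihilation(n):
    """a |k> = sqrt(k) |k-1>, truncated to dimension n."""
    m = [[0.0] * n for _ in range(n)]
    for k in range(1, n):
        m[k - 1][k] = math.sqrt(k)
    return m

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def commutator(a, b):
    """[a, b] = ab - ba."""
    ab, ba = matmul(a, b), matmul(b, a)
    return [[ab[i][j] - ba[i][j] for j in range(len(a))]
            for i in range(len(a))]

if __name__ == "__main__":
    n = 6
    c = commutator(annihilation(n), creation(n))
    print([c[i][i] for i in range(n)])  # identity except the cut-off corner
```

    In the rule-algebra picture, a and a† correspond to the vertex-deletion and vertex-creation rules of discrete (edgeless) graph rewriting.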

    Robustly Parameterised Higher-Order Probabilistic Models

    We present a method for constructing robustly parameterised families of higher-order probabilistic models. Parameter spaces and models are represented by certain classes of functors in the category of Polish spaces. Maps from parameter spaces to models (parameterisations) are continuous and natural transformations between such functors. Naturality ensures that parameterised models are invariant under change of granularity -- i.e., that parameterisations are intrinsic. Continuity ensures that models are robust with respect to their parameterisation. Our method allows one to build models from a set of basic functors, among which the Giry probabilistic functor, spaces of cadlag trajectories (in continuous and discrete time), multisets and compact powersets. These functors can be combined by guarded composition, product and coproduct. Parameter spaces range over the polynomial closure of Giry-like functors. Thus we obtain a class of robust parameterised models which includes the Dirichlet process, various point processes (random sequences with values in Polish spaces) and other classical objects of probability theory. By extending techniques developed in prior work, we show how to reduce the questions of existence, uniqueness, naturality, and continuity of a parameterised model to combinatorial questions involving only finite spaces.

    Borel Kernels and their Approximation, Categorically

    This paper introduces a categorical framework to study the exact and approximate semantics of probabilistic programs. We construct a dagger symmetric monoidal category of Borel kernels where the dagger-structure is given by Bayesian inversion. We show functorial bridges between this category and categories of Banach lattices which formalize the move from kernel-based semantics to predicate transformer (backward) or state transformer (forward) semantics. These bridges are related by natural transformations, and we show in particular that the Radon-Nikodym and Riesz representation theorems - two pillars of probability theory - define natural transformations. With the mathematical infrastructure in place, we present a generic and endogenous approach to approximating kernels on standard Borel spaces which exploits the involutive structure of our category of kernels. The approximation can be formulated in several equivalent ways by using the functorial bridges and natural transformations described above. Finally, we show that for sensible discretization schemes, every Borel kernel can be approximated by kernels on finite spaces, and that these approximations converge for a natural choice of topology. We illustrate the theory by showing two examples of how approximation can effectively be used in practice: Bayesian inference and the Kleene star operation of ProbNetKAT.
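    The approximation result can be illustrated with a minimal discretization (my sketch, not the paper's scheme): bin the Gaussian kernel k(x, ·) = N(x, 1) on a finite grid, sending outlying mass to the boundary bins so each row of the resulting finite kernel remains a probability distribution:

```python
# Hedged sketch: approximate a Borel kernel on R by a finite stochastic
# matrix via interval binning of both source and target.
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def discretize_gaussian(lo, hi, bins):
    """Finite kernel: row i = N(center_i, 1) binned over [lo, hi]."""
    width = (hi - lo) / bins
    edges = [lo + i * width for i in range(bins + 1)]
    centers = [lo + (i + 0.5) * width for i in range(bins)]
    matrix = []
    for x in centers:
        row = [norm_cdf(edges[j + 1] - x) - norm_cdf(edges[j] - x)
               for j in range(bins)]
        row[0] += norm_cdf(edges[0] - x)          # mass below the grid
        row[-1] += 1.0 - norm_cdf(edges[-1] - x)  # mass above the grid
        matrix.append(row)
    return matrix

if __name__ == "__main__":
    m = discretize_gaussian(-5.0, 5.0, 20)
    print(all(abs(sum(row) - 1.0) < 1e-12 for row in m))
```

    Refining the grid gives the convergence the paper establishes for a natural choice of topology.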

    Pointless learning

    Bayesian inversion is at the heart of probabilistic programming and more generally machine learning. Understanding inversion is made difficult by the pointful (kernel-centric) point of view usually taken in the literature. We develop a pointless (kernel-free) approach to inversion. While doing so, we revisit some foundational objects of probability theory, unravel their category-theoretical underpinnings and show how pointless Bayesian inversion sits naturally at the centre of this construction.
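    On finite spaces the inversion being abstracted is elementary, which makes a small worked example useful. The sketch below (illustrative names, not the paper's notation) takes a prior p on X and a kernel k : X → Y, forms the Bayesian inverse k† : Y → X by Bayes' rule, and checks that it induces the same joint distribution:

```python
# Hedged sketch: Bayesian inversion of a finite Markov kernel.

def pushforward(prior, kernel):
    """Marginal on Y: q(y) = sum_x p(x) k(x, y)."""
    ys = len(next(iter(kernel.values())))
    return [sum(prior[x] * kernel[x][y] for x in prior) for y in range(ys)]

def invert(prior, kernel):
    """Bayesian inverse: k_dag(y, x) = p(x) k(x, y) / q(y)."""
    q = pushforward(prior, kernel)
    return {y: {x: prior[x] * kernel[x][y] / q[y] for x in prior}
            for y in range(len(q)) if q[y] > 0}

if __name__ == "__main__":
    prior = {"a": 0.25, "b": 0.75}               # prior on X = {a, b}
    kernel = {"a": [0.9, 0.1], "b": [0.2, 0.8]}  # kernel X -> {0, 1}
    q = pushforward(prior, kernel)
    k_dag = invert(prior, kernel)
    # joint check: p(x) k(x, y) == q(y) k_dag(y, x) for all x, y
    print(all(abs(prior[x] * kernel[x][y] - q[y] * k_dag[y][x]) < 1e-12
              for x in prior for y in range(2)))
```

    The pointless approach characterizes this operation without ever mentioning the points x and y, which is what makes it robust on general spaces.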

    Bayesian Inversion by Omega-Complete Cone Duality (Invited Paper)

    The process of inverting Markov kernels relates to the important subject of Bayesian modelling and learning. In fact, Bayesian update is exactly kernel inversion. In this paper, we investigate how and when Markov kernels (aka stochastic relations, or probabilistic mappings, or simply kernels) can be inverted. We address the question both directly on the category of measurable spaces, and indirectly by interpreting kernels as Markov operators:
    - For the direct option, we introduce a typed version of the category of Markov kernels and use the so-called "disintegration of measures". Here, one has to specialise to measurable spaces borne from a simple class of topological spaces, e.g. Polish spaces (other choices are possible). Our method and result greatly simplify a recent development in Ref. [4].
    - For the operator option, we use a cone version of the category of Markov operators (kernels seen as predicate transformers). That is to say, our linear operators are not just continuous, but are required to satisfy the stronger condition of being ω-chain-continuous. Prior work shows that one obtains an adjunction in the form of a pair of contravariant and inverse functors between the categories of L₁- and L∞-cones [3]. Inversion, seen through the operator prism, is just adjunction. No topological assumption is needed.
    - We show that both categories (Markov kernels and ω-chain-continuous Markov operators) are related by a family of contravariant functors T_p for 1 ≀ p ≀ ∞. The T_p's are Kleisli extensions of (duals of) conditional expectation functors introduced in Ref. [3].
    - With this bridge in place, we can prove that both notions of inversion agree when both are defined: if f is a kernel and f† its direct inverse, then T∞(f)† = T₁(f†).
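    The "inversion is just adjunction" slogan can be checked directly in finite dimensions (my illustration; the paper works in infinite-dimensional cones). Viewing a kernel k as a predicate transformer (Kg)(x) = Σ_y k(x, y) g(y), the Bayesian inverse k† satisfies the adjunction identity E_p[f · (Kg)] = E_q[(K†f) · g], where p is the prior and q its pushforward:

```python
# Hedged sketch: verify the kernel/operator adjunction on finite spaces.

def transform(kernel, g):
    """Predicate transformer: (K g)(x) = sum_y k(x, y) g(y)."""
    return {x: sum(kernel[x][y] * g[y] for y in range(len(g)))
            for x in kernel}

def adjunction_gap(p, k, f, g):
    """|E_p[f * (K g)] - E_q[(K_dag f) * g]| for the Bayesian inverse."""
    ys = range(len(g))
    q = [sum(p[x] * k[x][y] for x in p) for y in ys]      # pushforward
    k_dag = {y: {x: p[x] * k[x][y] / q[y] for x in p} for y in ys}
    kg = transform(k, g)
    lhs = sum(p[x] * f[x] * kg[x] for x in p)
    rhs = sum(q[y] * g[y] * sum(k_dag[y][x] * f[x] for x in p) for y in ys)
    return abs(lhs - rhs)

if __name__ == "__main__":
    p = {"a": 0.4, "b": 0.6}                 # prior on X
    k = {"a": [0.7, 0.3], "b": [0.1, 0.9]}   # kernel X -> {0, 1}
    f = {"a": 2.0, "b": -1.0}                # predicate on X
    g = [1.0, 3.0]                           # predicate on Y
    print(adjunction_gap(p, k, f, g))        # zero up to rounding
```

    Both sides unfold to the same sum Σ_{x,y} p(x) k(x, y) f(x) g(y) over the joint distribution, which is exactly why no topological assumption is needed in the finite case.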

    Pointless learning (long version)

    Bayesian inversion is at the heart of probabilistic programming and more generally machine learning. Understanding inversion is made difficult by the pointful (kernel-centric) point of view usually taken in the literature. We develop a pointless (kernel-free) approach to inversion. While doing so, we revisit some foundational objects of probability theory, unravel their category-theoretical underpinnings and show how pointless Bayesian inversion sits naturally at the centre of this construction.
