468 research outputs found

    A categorical characterization of relative entropy on standard Borel spaces

    We give a categorical treatment, in the spirit of Baez and Fritz, of relative entropy for probability distributions defined on standard Borel spaces. We define a category suitable for reasoning about statistical inference on standard Borel spaces. We define relative entropy as a functor into Lawvere's category, and we show convexity, lower semicontinuity and uniqueness. Comment: 16 pages.
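    The functor described above assigns to a pair of distributions their relative entropy, a value in [0, ∞]. For finite discrete distributions this is the familiar Kullback-Leibler divergence, sketched below; this is only a minimal finite-case illustration, not the categorical construction on standard Borel spaces.

```python
import math

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) for finite distributions.

    Illustrative sketch only: the paper treats general standard Borel
    spaces; this covers just the finite, discrete case. Returns
    math.inf when p puts mass where q does not, the convention under
    which D is lower semicontinuous.
    """
    d = 0.0
    for pi, qi in zip(p, q):
        if pi == 0:
            continue            # 0 * log(0 / q) = 0 by convention
        if qi == 0:
            return math.inf     # absolute continuity fails
        d += pi * math.log(pi / qi)
    return d

print(relative_entropy([0.5, 0.5], [0.5, 0.5]))  # 0.0
print(relative_entropy([0.9, 0.1], [0.5, 0.5]))  # strictly positive
```

    As the abstract notes, D is convex in its arguments and vanishes exactly when the two distributions coincide.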

    Markov Categories and Entropy

    Markov categories are a novel framework to describe and treat problems in probability and information theory. In this work we combine the categorical formalism with the traditional quantitative notions of entropy, mutual information, and data processing inequalities. We show that several quantitative aspects of information theory can be captured by an enriched version of Markov categories, where the spaces of morphisms are equipped with a divergence or even a metric. As is customary in information theory, mutual information can be defined as a measure of how far a joint source is from displaying independence of its components. More strikingly, Markov categories give a notion of determinism for sources and channels, and we can define entropy exactly by measuring how far a source or channel is from being deterministic. This recovers Shannon and Rényi entropies, as well as the Gini-Simpson index used in ecology to quantify diversity, and it can be used to give a conceptual definition of generalized entropy. Comment: 54 pages.
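    The three quantities the abstract says are recovered — Shannon entropy, Rényi entropy, and the Gini-Simpson index — can be sketched for a finite distribution as follows. This is a minimal numerical illustration, not the enriched categorical definition via distance from determinism.

```python
import math

def shannon(p):
    """Shannon entropy H(p) = -sum p_i log p_i, in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi(p, alpha):
    """Renyi entropy of order alpha != 1; tends to Shannon as alpha -> 1."""
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

def gini_simpson(p):
    """Gini-Simpson index: probability that two independent draws differ."""
    return 1 - sum(pi * pi for pi in p)

# A deterministic source has entropy 0 under all three measures;
# the uniform distribution maximizes each, matching the idea of
# entropy as distance from determinism.
uniform = [0.25, 0.25, 0.25, 0.25]
print(shannon(uniform), renyi(uniform, 2), gini_simpson(uniform))
```

    For the point mass [1, 0, 0, 0] all three functions return 0, while any spread-out distribution scores strictly higher.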

    Causality in Schwinger's Picture of Quantum Mechanics

    This paper begins the study of the relation between causality and quantum mechanics, taking advantage of the groupoidal description of quantum mechanical systems inspired by Schwinger's picture of quantum mechanics. After identifying causal structures on groupoids with a particular class of subcategories, called causal categories accordingly, it will be shown that causal structures can be recovered from a particular class of non-self-adjoint algebras, known as triangular operator algebras, contained in the von Neumann algebra of the groupoid of the quantum system. As a consequence, Sorkin's incidence theorem will be proved and some illustrative examples will be discussed.

    This research was funded by the Spanish Ministry of Economy and Competitiveness (MINECO), through the Severo Ochoa Programme for Centres of Excellence in R&D (SEV-2015/0554), the MINECO research project PID2020-117477GB-I00, the Comunidad de Madrid project QUITEMAD+ (S2013/ICE-2801), and the CONEX-Plus programme (University Carlos III of Madrid), Marie Sklodowska-Curie COFUND Action (H2020-MSCA-COFUND-2017-GA 801538). This work has also been supported by the Madrid Government (Comunidad de Madrid, Spain) under the Multiannual Agreement with UC3M in the line of "Research Funds for Beatriz Galindo Fellowships" (C&QIG-BG-CM-UC3M), and in the context of the V PRICIT (Regional Programme of Research and Technological Innovation).

    Dilations and information flow axioms in categorical probability

    We study the positivity and causality axioms for Markov categories as properties of dilations and information flow, both in Markov categories and in variations thereof for arbitrary semicartesian monoidal categories. These help us show that being a positive Markov category is merely an additional property of a symmetric monoidal category (rather than extra structure). We also characterize the positivity of representable Markov categories and prove that causality implies positivity, but not conversely. Finally, we note that positivity fails for quasi-Borel spaces and interpret this failure as a privacy property of probabilistic name generation. Comment: 42 pages.