A categorical characterization of relative entropy on standard Borel spaces
We give a categorical treatment, in the spirit of Baez and Fritz, of relative
entropy for probability distributions defined on standard Borel spaces. We
define a category suitable for reasoning about statistical inference on
standard Borel spaces. We define relative entropy as a functor into Lawvere's
category and we show convexity, lower semicontinuity, and uniqueness.
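The relative entropy characterized above is the Kullback-Leibler divergence. A minimal sketch, restricted to finite distributions rather than standard Borel spaces (the function name is illustrative, not from the paper):

```python
import math

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) for finite distributions
    given as probability vectors of equal length.

    The value may be +infinity, consistent with relative entropy being
    only lower semicontinuous; it is also jointly convex in (p, q).
    """
    d = 0.0
    for pi, qi in zip(p, q):
        if pi == 0.0:
            continue  # 0 * log(0 / q) = 0 by convention
        if qi == 0.0:
            return math.inf  # p puts mass where q does not
        d += pi * math.log2(pi / qi)
    return d
```

For example, `relative_entropy([0.5, 0.5], [0.5, 0.5])` is zero, and the divergence is strictly positive whenever the two distributions differ.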
Markov Categories and Entropy
Markov categories are a novel framework to describe and treat problems in
probability and information theory.
In this work we combine the categorical formalism with the traditional
quantitative notions of entropy, mutual information, and data processing
inequalities. We show that several quantitative aspects of information theory
can be captured by an enriched version of Markov categories, where the spaces
of morphisms are equipped with a divergence or even a metric.
As it is customary in information theory, mutual information can be defined
as a measure of how far a joint source is from displaying independence of its
components.
More strikingly, Markov categories give a notion of determinism for sources
and channels, and we can define entropy exactly by measuring how far a source
or channel is from being deterministic. This recovers Shannon and Rényi
entropies, as well as the Gini-Simpson index used in ecology to quantify
diversity, and it can be used to give a conceptual definition of generalized
entropy. No previous knowledge of category theory is assumed.
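The two quantitative ideas in this abstract, mutual information as divergence from independence and entropy as divergence from determinism, can be sketched for finite distributions (all names here are illustrative; the paper works at the level of Markov categories, not this concrete code):

```python
import math

def shannon_entropy(p):
    """H(p) = -sum p_i log2 p_i; zero exactly for deterministic sources."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha != 1); also zero iff deterministic."""
    return math.log2(sum(pi ** alpha for pi in p)) / (1 - alpha)

def gini_simpson(p):
    """Gini-Simpson index: probability that two independent samples differ."""
    return 1 - sum(pi ** 2 for pi in p)

def mutual_information(joint):
    """I(X;Y) as the KL divergence between a joint distribution (2D table
    of probabilities) and the product of its marginals."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pij in enumerate(row):
            if pij > 0:
                mi += pij * math.log2(pij / (px[i] * py[j]))
    return mi
```

A deterministic source such as `[1.0, 0.0]` has zero entropy under all three measures, and an independent joint source such as `[[0.25, 0.25], [0.25, 0.25]]` has zero mutual information, matching the "distance from determinism / independence" reading above.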
Causality in Schwinger's Picture of Quantum Mechanics
This paper begins the study of the relation between causality and quantum mechanics, taking
advantage of the groupoidal description of quantum mechanical systems inspired by Schwinger’s
picture of quantum mechanics. After identifying causal structures on groupoids with a particular
class of subcategories, called causal categories accordingly, it will be shown that causal structures
can be recovered from a particular class of non-selfadjoint algebras, known as triangular
operator algebras, contained in the von Neumann algebra of the groupoid of the quantum system.
As a consequence of this, Sorkin’s incidence theorem will be proved and some illustrative examples
will be discussed.

This research was funded by the Spanish Ministry of Economy and Competitiveness (MINECO), through the Severo Ochoa Programme for Centres of Excellence in R&D (SEV-2015/0554), the MINECO research project PID2020-117477GB-I00, the Comunidad de Madrid project QUITEMAD+, S2013/ICE-2801, and the CONEX-Plus programme (University Carlos III of Madrid), Marie Skłodowska-Curie COFUND Action (H2020-MSCA-COFUND-2017-GA 801538). This work has been supported by the Madrid Government (Comunidad de Madrid, Spain) under the Multiannual Agreement with UC3M in the line of “Research Funds for Beatriz Galindo Fellowships” (C&QIG-BG-CM-UC3M), and in the context of the V PRICIT (Regional Programme of Research and Technological Innovation).
Computability Theory
Computability and computable enumerability are two of the fundamental notions of mathematics. Interest in effectiveness is already apparent in the famous Hilbert problems, in particular the second and tenth, and in early 20th-century work of Dehn, initiating the study of word problems in group theory. The last decade has seen both the development of completely new subareas and remarkable growth in two-way interactions between classical computability theory and areas of application. There is also a great deal of work on algorithmic randomness, reverse mathematics, computable analysis, and computable structure theory/computable model theory. The goal of this workshop is to bring together researchers representing different aspects of computability theory to discuss recent advances, and to stimulate future work.
Dilations and information flow axioms in categorical probability
We study the positivity and causality axioms for Markov categories as
properties of dilations and information flow in Markov categories, and in
variations thereof for arbitrary semicartesian monoidal categories. These help
us show that being a positive Markov category is merely an additional property
of a symmetric monoidal category (rather than extra structure). We also
characterize the positivity of representable Markov categories and prove that
causality implies positivity, but not conversely. Finally, we note that
positivity fails for quasi-Borel spaces and interpret this failure as a privacy
property of probabilistic name generation.