Meta SOS - A Maude Based SOS Meta-Theory Framework
Meta SOS is a software framework designed to integrate the results from the
meta-theory of structural operational semantics (SOS). These results include
deriving semantic properties of language constructs just by syntactically
analyzing their rule-based definition, as well as automatically deriving sound
and ground-complete axiomatizations for languages, when considering a notion of
behavioural equivalence. This paper describes the Meta SOS framework by
blending aspects from the meta-theory of SOS, details on their implementation
in Maude, and running examples.
Comment: In Proceedings EXPRESS/SOS 2013, arXiv:1307.690
Dilation of states and processes in operational-probabilistic theories
This paper provides a concise summary of the framework of
operational-probabilistic theories, aimed at emphasizing the interaction
between category-theoretic and probabilistic structures. Within this framework,
we review an operational version of the GNS construction, expressed by the
so-called purification principle, which under mild hypotheses leads to an
operational version of Stinespring's theorem.
Comment: In Proceedings QPL 2014, arXiv:1412.810
Measuring School Segregation
Using only ordinal axioms, we characterize several multigroup school segregation indices: the Atkinson Indices for the class of school districts with a given fixed number of ethnic groups and the Mutual Information Index for the class of all districts. Properties of other school segregation indices are also discussed. In an empirical application, we document a weakening of the effect of ethnicity on school assignment from 1987/8 to 2007/8. We also show that segregation between districts within cities currently accounts for 33% of total segregation. Segregation between states, driven mainly by the distinct residential patterns of Hispanics, contributes another 32%.
Keywords: segregation; measurement; indices
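The Mutual Information Index characterized above has a standard closed form: for a joint distribution p(s, g) of students over schools s and ethnic groups g, it is the mutual information between school and group. The sketch below is our own illustration (the function name and toy data are not from the paper):

```python
import numpy as np

# Mutual Information Index of segregation for a schools-by-groups count
# matrix N[s, g]:
#   M = sum_{s,g} p(s,g) * log( p(s,g) / (p(s) * p(g)) )
def mutual_information_index(counts):
    p = counts / counts.sum()                 # joint distribution p(s, g)
    ps = p.sum(axis=1, keepdims=True)         # school marginals p(s)
    pg = p.sum(axis=0, keepdims=True)         # group marginals p(g)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(p > 0, p * np.log(p / (ps * pg)), 0.0)
    return terms.sum()

# Toy two-school, two-group districts (hypothetical data):
segregated = np.array([[100, 0], [0, 100]])   # each school hosts one group
integrated = np.array([[50, 50], [50, 50]])   # identical school compositions
```

On these toy districts the index is log 2 for the fully segregated case and 0 for the fully integrated one, matching the intuition that the index measures how much knowing a student's school reveals about their group.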
Formal verification of higher-order probabilistic programs
Probabilistic programming provides a convenient lingua franca for writing
succinct and rigorous descriptions of probabilistic models and inference tasks.
Several probabilistic programming languages, including Anglican, Church or
Hakaru, derive their expressiveness from a powerful combination of continuous
distributions, conditioning, and higher-order functions. Although very
important for practical applications, these combined features raise fundamental
challenges for program semantics and verification. Several recent works offer
promising answers to these challenges, but their primary focus is on semantical
issues.
In this paper, we take a step further and develop a set of program logics,
named PPV, for proving properties of programs written in an expressive
probabilistic higher-order language with continuous distributions and operators
for conditioning distributions by real-valued functions. Pleasingly, our
program logics retain the comfortable reasoning style of informal proofs thanks
to carefully selected axiomatizations of key results from probability theory.
The versatility of our logics is illustrated through the formal verification of
several intricate examples from statistics, probabilistic inference, and
machine learning. We further show the expressiveness of our logics by giving
sound embeddings of existing logics. In particular, we do this in a parametric
way by showing how the semantic idea of (unary and relational) TT-lifting can
be internalized in our logics. The soundness of PPV follows by interpreting
programs and assertions in quasi-Borel spaces (QBS), a recently proposed
variant of Borel spaces with a good structure for interpreting higher-order
probabilistic programs.
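Conditioning a distribution by a real-valued function, as in the language underlying PPV, admits an elementary operational reading via self-normalized importance sampling: each prior sample is weighted by the conditioning score. The following is a hypothetical sketch in that spirit (the model and function names are our own, not PPV syntax):

```python
import math
import random

random.seed(0)

def gaussian_pdf(x, mu, sd):
    # density of Normal(mu, sd) at x
    return math.exp(-((x - mu) ** 2) / (2 * sd ** 2)) / (sd * math.sqrt(2 * math.pi))

def posterior_mean(n=200000):
    # model: mu ~ Normal(0, 1); condition by the density of observing
    # y = 1 under Normal(mu, 1); query the posterior mean of mu.
    total_w = total_wmu = 0.0
    for _ in range(n):
        mu = random.gauss(0.0, 1.0)       # sample from the prior
        w = gaussian_pdf(1.0, mu, 1.0)    # real-valued conditioning score
        total_w += w
        total_wmu += w * mu
    return total_wmu / total_w
```

By Gaussian conjugacy the exact posterior is Normal(0.5, 1/2), so the sampled estimate should be close to 0.5; a program logic like PPV instead proves such facts deductively, without sampling.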
Essays in structural heuristics
This dissertation introduces and develops a new method of rational reconstruction called structural heuristics. Structural heuristics takes the assignment of structure to any given object of investigation as the starting point for its rational reconstruction: any given object is viewed as a system of relations and of transformation laws for those relations. The operational content of this heuristics can be summarized as follows: when facing any given system, the best way to approach it is to look explicitly for a possible structure of it. The use of structural heuristics allows structural awareness, which is considered a fundamental epistemic disposition, as well as a fundamental condition for the rational reconstruction of systems of knowledge.
In this dissertation, structural heuristics is applied to reconstructing the domain of economic knowledge. This is done by exploring four distinct areas of economic research: (i) economic axiomatics; (ii) realism in economics; (iii) production theory; (iv) economic psychology. The application of structural heuristics to these fields of economic inquiry shows the flexibility and potential of structural heuristics as an epistemic tool for theoretical exploration and reconstruction.
Investigations into Proof Structures
We introduce and elaborate a novel formalism for the manipulation and
analysis of proofs as objects in a global manner. In this first approach the
formalism is restricted to first-order problems characterized by condensed
detachment. It is applied in an exemplary manner to a coherent and
comprehensive formal reconstruction and analysis of historical proofs of a
widely-studied problem due to Łukasiewicz. The underlying approach opens the
door towards new systematic ways of generating lemmas in the course of proof
search to the effects of reducing the search effort and finding shorter proofs.
Among the numerous reported experiments along this line, a proof of
Łukasiewicz's problem was automatically discovered that is much shorter than
any proof found before by man or machine.
Comment: This article is a continuation of arXiv:2104.1364
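Condensed detachment, the rule to which the formalism is restricted, combines a major premise C(α, β) with a minor premise γ by unifying α against a variable-renamed copy of γ and emitting the corresponding instance of β. A minimal sketch with our own term encoding (variables as strings, implication as the constructor 'C'; not the paper's formalism):

```python
def is_var(t):
    return isinstance(t, str)

def walk(t, s):
    # follow variable bindings in substitution s
    while is_var(t) and t in s:
        t = s[t]
    return t

def occurs(v, t, s):
    t = walk(t, s)
    return t == v or (not is_var(t) and any(occurs(v, x, s) for x in t[1:]))

def unify(a, b, s):
    # most general unifier of a and b, extending s; None on failure
    a, b = walk(a, s), walk(b, s)
    if a == b:
        return s
    if is_var(a):
        return None if occurs(a, b, s) else {**s, a: b}
    if is_var(b):
        return None if occurs(b, a, s) else {**s, b: a}
    if a[0] == b[0] and len(a) == len(b):
        for x, y in zip(a[1:], b[1:]):
            s = unify(x, y, s)
            if s is None:
                return None
        return s
    return None

def subst(t, s):
    t = walk(t, s)
    return t if is_var(t) else (t[0],) + tuple(subst(x, s) for x in t[1:])

def rename(t, suffix):
    # rename every variable apart by appending a suffix
    return t + suffix if is_var(t) else (t[0],) + tuple(rename(x, suffix) for x in t[1:])

def condensed_detachment(major, minor):
    # major must be an implication C(alpha, beta); unify alpha with a
    # variable-renamed copy of minor and return the instance of beta.
    major, minor = rename(major, "1"), rename(minor, "2")
    if is_var(major) or major[0] != 'C':
        return None
    s = unify(major[1], minor, {})
    return None if s is None else subst(major[2], s)

# Example: from C(p, C(q, p)) and a theorem C(a, b), condensed
# detachment yields C(q, C(a, b)) up to variable renaming.
thm = condensed_detachment(('C', 'p', ('C', 'q', 'p')), ('C', 'a', 'b'))
```

The renaming step is what makes the rule "condensed": no instance of either premise has to be guessed in advance, since unification computes the most general result.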
Potential Games and Interactive Decisions with Multiple Criteria
Game theory is a mathematical theory for analyzing strategic interaction between decision makers. This thesis covers two game-theoretic topics. The first part deals with potential games: noncooperative games in which the information about the goals of the separate players that is required to determine equilibria can be aggregated into a single function. The structure of different types of potential games is investigated. Congestion problems and the financing of public goods through voluntary contributions are studied in this framework. The second part abandons the common assumption that each player is guided by a single goal and takes into account players who are guided by several, possibly conflicting, objective functions.
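The defining feature of an exact potential game, namely that all equilibrium-relevant information aggregates into a single function, can be checked directly on a toy congestion game via Rosenthal's potential. The example below is our own sketch, not taken from the thesis:

```python
from itertools import product

# Two players each pick one of two resources; c_r(k) is the cost of
# resource r when k players use it (hypothetical cost data).
costs = {"a": [1, 3], "b": [2, 4]}

def loads(profile):
    return {r: sum(1 for choice in profile if choice == r) for r in costs}

def player_cost(profile, i):
    r = profile[i]
    return costs[r][loads(profile)[r] - 1]

def potential(profile):
    # Rosenthal's potential: Phi(s) = sum_r sum_{k=1}^{load_r(s)} c_r(k)
    ld = loads(profile)
    return sum(sum(costs[r][:ld[r]]) for r in costs)

# Potential property: any unilateral deviation changes Phi by exactly
# the change in the deviating player's own cost.
ok = all(
    (potential(p2) - potential(p1)) == (player_cost(p2, i) - player_cost(p1, i))
    for p1 in product("ab", repeat=2)
    for i in range(2)
    for p2 in [p1[:i] + (d,) + p1[i + 1:] for d in "ab" if d != p1[i]]
)
```

Because every player's incentive is mirrored in one function, local maxima of the potential are pure Nash equilibria, which is why congestion games always admit one.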
Bridging the gap between general probabilistic theories and the device-independent framework for nonlocality and contextuality
Characterizing quantum correlations in terms of information-theoretic
principles is a popular chapter of quantum foundations. Traditionally, the
principles adopted for this scope have been expressed in terms of conditional
probability distributions, specifying the probability that a black box produces
a certain output upon receiving a certain input. This framework is known as
"device-independent". Another major chapter of quantum foundations is the
information-theoretic characterization of quantum theory, with its sets of
states and measurements, and with its allowed dynamics. The different
frameworks adopted for this scope are known under the umbrella term "general
probabilistic theories". With only a few exceptions, the two programmes on
characterizing quantum correlations and characterizing quantum theory have so
far proceeded on separate tracks, each one developing its own methods and its
own agenda. This paper aims at bridging the gap, by comparing the two
frameworks and illustrating how the two programmes can benefit each other.
Comment: 61 pages, no figures, published version
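In the device-independent framework described here, a black box is just a table of conditional probabilities p(a, b | x, y). A standard worked example, sketched below with our own function names, is the CHSH value, which separates classical (at most 2), quantum (at most 2√2), and general no-signaling (at most 4) behaviours; the Popescu-Rohrlich box saturates 4:

```python
from itertools import product

def pr_box(a, b, x, y):
    # Popescu-Rohrlich box: outputs satisfy a XOR b = x AND y, uniformly.
    return 0.5 if (a ^ b) == (x & y) else 0.0

def chsh(p):
    # S = sum_{x,y} (-1)^(x*y) * E(x, y), with correlators
    # E(x, y) = sum_{a,b} (-1)^(a+b) * p(a, b | x, y)
    total = 0.0
    for x, y in product((0, 1), repeat=2):
        E = sum((-1) ** (a + b) * p(a, b, x, y)
                for a, b in product((0, 1), repeat=2))
        total += (-1) ** (x * y) * E
    return total
```

The general-probabilistic-theories side of the paper asks, conversely, which state-and-measurement structures can realize a given such table, which is exactly the gap the paper aims to bridge.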