Convolution, Separation and Concurrency
A notion of convolution is presented in the context of formal power series
together with lifting constructions characterising algebras of such series,
which usually are quantales. A number of examples underpin the universality of
these constructions, the most prominent ones being separation logics, where
convolution is separating conjunction in an assertion quantale; interval
logics, where convolution is the chop operation; and stream interval functions,
where convolution is used for analysing the trajectories of dynamical or
real-time systems. A Hoare logic is constructed in a generic fashion on the
power series quantale, which applies to each of these examples. In many cases,
commutative notions of convolution have natural interpretations as concurrency
operations.

Comment: 39 pages
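To make the central definition concrete (a minimal sketch in generic notation, assuming a partial semigroup $(S,\cdot)$ of states and a quantale $(Q,\leq,\odot)$ of values; the paper's own setting may differ in detail), convolution lifts to functions $f,g\colon S\to Q$ as

\[ (f * g)(x) \;=\; \bigvee_{x = y\,\cdot\, z} f(y) \odot g(z). \]

Taking $S$ to be heaps under disjoint union and $Q$ the Boolean quantale recovers separating conjunction; taking $S$ to be intervals under fusion recovers the chop operation.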
Information Physics: The New Frontier
At this point in time, two major areas of physics, statistical mechanics and
quantum mechanics, rest on the foundations of probability and entropy. The last
century saw several significant fundamental advances in our understanding of
the process of inference, which make it clear that these are inferential
theories. That is, rather than being a description of the behavior of the
universe, these theories describe how observers can make optimal predictions
about the universe. In such a picture, information plays a critical role.
Moreover, little clues, such as the fact that black holes have entropy,
continue to suggest that information is fundamental to physics in general.
In the last decade, our fundamental understanding of probability theory has
led to a Bayesian revolution. In addition, we have come to recognize that the
foundations go far deeper and that Cox's approach of generalizing a Boolean
algebra to a probability calculus is the first specific example of the more
fundamental idea of assigning valuations to partially-ordered sets. By
considering this as a natural way to introduce quantification to the more
fundamental notion of ordering, one obtains an entirely new way of deriving
physical laws. I will introduce this new way of thinking by demonstrating how
one can quantify partially-ordered sets and, in the process, derive physical
laws. The implication is that physical law does not reflect the order in the
universe; instead, it is derived from the order imposed by our description of
the universe. Information physics, which is based on understanding the ways in
which we both quantify and process information about the world around us, is a
fundamentally new approach to science.

Comment: 17 pages, 6 figures. Knuth K.H. 2010. Information physics: The new
frontier. In J.-F. Bercher, P. Bessière, and A. Mohammad-Djafari (eds.),
Bayesian Inference and Maximum Entropy Methods in Science and Engineering
(MaxEnt 2010), Chamonix, France, July 2010.
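As a concrete instance of quantifying a partially ordered set (a sketch of the standard result, in notation assumed here rather than quoted from the abstract): if a valuation $v$ assigns real numbers to the elements of a distributive lattice consistently with the lattice structure, one is forced, up to regraduation, into the sum rule

\[ v(x \vee y) \;=\; v(x) + v(y) - v(x \wedge y), \]

which specialises to the additivity of probability for mutually exclusive propositions ($x \wedge y = \bot$) when the lattice is a Boolean algebra.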
An Intuitionistic Formula Hierarchy Based on High-School Identities
We revisit the notion of intuitionistic equivalence and formal proof
representations by adopting the view of formulas as exponential polynomials.
After observing that most of the invertible proof rules of intuitionistic
(minimal) propositional sequent calculi are formula (i.e. sequent) isomorphisms
corresponding to the high-school identities, we show that one can obtain a more
compact variant of a proof system, consisting of non-invertible proof rules
only, and where the invertible proof rules have been replaced by a formula
normalisation procedure.
Moreover, for certain proof systems such as the G4ip sequent calculus of
Vorob'ev, Hudelmaier, and Dyckhoff, it is even possible to see all of the
non-invertible proof rules as strict inequalities between exponential
polynomials; a careful combinatorial treatment is given in order to establish
this fact.
Finally, we extend the exponential polynomial analogy to the first-order
quantifiers, showing that it gives rise to an intuitionistic hierarchy of
formulas, resembling the classical arithmetical hierarchy, and the first one
that classifies formulas while preserving isomorphism.
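To make the exponential-polynomial reading concrete (the standard interpretation, assumed here: $\wedge$ as product, $\vee$ as sum, and $A \to B$ as the exponential $B^A$), three high-school identities correspond to the following formula isomorphisms:

\[
\begin{aligned}
(c^{b})^{a} = c^{ab} &\quad\longleftrightarrow\quad A \to (B \to C) \;\cong\; (A \wedge B) \to C,\\
(bc)^{a} = b^{a}c^{a} &\quad\longleftrightarrow\quad A \to (B \wedge C) \;\cong\; (A \to B) \wedge (A \to C),\\
c^{a+b} = c^{a}c^{b} &\quad\longleftrightarrow\quad (A \vee B) \to C \;\cong\; (A \to C) \wedge (B \to C).
\end{aligned}
\]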
Syntactic completeness of proper display calculi
A recent strand of research in structural proof theory aims at exploring the
notion of analytic calculi (i.e. those calculi that support general and modular
proof-strategies for cut elimination), and at identifying classes of logics
that can be captured in terms of these calculi. In this context, Wansing
introduced the notion of proper display calculi as one possible design
framework for proof calculi in which the analyticity desiderata are realized in
a particularly transparent way. Recently, the theory of properly displayable
logics (i.e. those logics that can be equivalently presented with some proper
display calculus) has been developed in connection with generalized Sahlqvist
theory (aka unified correspondence). Specifically, properly displayable logics
have been syntactically characterized as those axiomatized by analytic
inductive axioms, which can be equivalently and algorithmically transformed
into analytic structural rules so that the resulting proper display calculi
enjoy a set of basic properties: soundness, completeness, conservativity, cut
elimination and subformula property. In this context, the proof that the given
calculus is complete w.r.t. the original logic is usually carried out
syntactically, i.e. by showing that a (cut free) derivation exists of each
given axiom of the logic in the basic system to which the analytic structural
rules algorithmically generated from the given axiom have been added. However,
so far this proof strategy for syntactic completeness has been implemented on a
case-by-case basis, and not in general. In this paper, we address this gap by
proving syntactic completeness for properly displayable logics in any normal
(distributive) lattice expansion signature. Specifically, we show that for
every analytic inductive axiom a cut free derivation can be effectively
generated which has a specific shape, referred to as pre-normal form.

Comment: arXiv admin note: text overlap with arXiv:1604.08822 by other authors
Automated verification of refinement laws
Demonic refinement algebras are variants of Kleene algebras. Introduced by von Wright as a lightweight variant of the refinement calculus, their intended semantics are positively disjunctive predicate transformers, and their calculus lies entirely within first-order equational logic. So, for the first time, off-the-shelf automated theorem proving (ATP) becomes available for refinement proofs. We used ATP to verify a toolkit of basic refinement laws. Based on this toolkit, we then verified two classical, complex refinement laws for action systems by ATP: a data refinement law and Back's atomicity refinement law. We also present a refinement law for infinite loops that was discovered through automated analysis. Our proof experiments not only demonstrate that refinement can effectively be automated; they also compare eleven different ATP systems and suggest that program verification with variants of Kleene algebras yields interesting theorem-proving benchmarks. Finally, we apply hypothesis learning techniques that seem indispensable for automating more complex proofs.
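To give a flavour of the equational reasoning an ATP system is handed (a generic Kleene-algebra sketch, not the paper's exact demonic-refinement-algebra axiomatisation): the Kleene star satisfies the unfold and induction laws

\[ 1 + x\,x^{*} \leq x^{*}, \qquad z + x\,y \leq y \;\Rightarrow\; x^{*}z \leq y, \]

from which toolkit laws such as sliding, $x\,(yx)^{*} = (xy)^{*}\,x$, and denesting, $(x+y)^{*} = x^{*}(y\,x^{*})^{*}$, follow by purely first-order equational reasoning.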