Implicit complexity for coinductive data: a characterization of corecurrence
We propose a framework for reasoning about programs that manipulate
coinductive data as well as inductive data. Our approach is based on using
equational programs, which support a seamless combination of computation and
reasoning, and using productivity (fairness) as the fundamental assertion,
rather than bi-simulation. The latter is expressible in terms of the former. As
an application to this framework, we give an implicit characterization of
corecurrence: a function is definable using corecurrence iff its productivity
is provable using coinduction for formulas in which data-predicates do not
occur negatively. This is an analog, albeit in weaker form, of a
characterization of recurrence (i.e. primitive recursion) in [Leivant, Unipolar
induction, TCS 318, 2004]. Comment: In Proceedings DICE 2011, arXiv:1201.034
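The paper works with equational programs rather than any concrete programming language, but the notions it builds on can be illustrated with ordinary generators. A minimal Python sketch (the stream names are ours, not the paper's): a corecursive definition is productive when every finite prefix of the stream it defines can be computed in finitely many steps.

```python
from itertools import islice

def nats():
    """Corecursive stream of natural numbers: each element is
    produced before the (infinite) rest is ever demanded."""
    n = 0
    while True:
        yield n
        n += 1

def zip_with(f, xs, ys):
    """Pointwise combination of two streams -- a typical corecurrence
    scheme: each output element depends only on the input elements
    seen so far, which guarantees productivity."""
    for x, y in zip(xs, ys):
        yield f(x, y)

def fibs():
    """The Fibonacci stream, defined by corecurrence."""
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

# Productivity in action: any finite prefix is computable.
prefix = list(islice(fibs(), 8))
```

An unproductive definition, by contrast, would loop forever before yielding some element, so no finite prefix beyond that point could be observed.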
Logical Specification and Analysis of Fault Tolerant Systems through Partial Model Checking
This paper presents a framework for a logical characterisation of fault tolerance and its formal analysis based on partial model checking techniques. The framework requires a fault tolerant system to be modelled using a formal calculus, here the CCS process algebra. To this aim we propose a uniform modelling scheme in which to specify a formal model of the system, its failing behaviour and possibly its fault-recovering procedures. Once a formal model is provided in our scheme, fault tolerance - with respect to a given property - can be formalized as an equational µ-calculus formula. This formula expresses, in a logical formalism, all the fault scenarios satisfying that fault tolerance property. Such a characterisation casts the analysis of fault tolerance as a form of analysis of open systems and, thanks to partial model checking strategies, it can be made independent of any particular fault assumption. Moreover, this logical characterisation makes it possible to express the fault-tolerance verification problem as a general µ-calculus validation problem, for which many theorem-proving techniques and tools are available. We present several analysis methods showing the flexibility of our approach.
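The paper's scheme is formulated in CCS with an equational µ-calculus, which is not reproduced here. As a loose illustration of the idea only, the Python sketch below composes a nominal behaviour with an injected fault and a recovery step, then checks a recoverability property by plain state exploration; the transition system and all names are our own toy example, and exhaustive reachability stands in for the partial model checking machinery.

```python
# Toy system: nominal behaviour plus an injected failing behaviour
# ('fault') and a fault-recovering procedure ('recover').
TRANSITIONS = {
    "ok":     [("work", "ok"), ("fault", "failed")],
    "failed": [("recover", "ok")],
}

def reachable(init):
    """All states reachable from init -- a crude stand-in for
    model checking the composed system."""
    seen, frontier = set(), [init]
    while frontier:
        s = frontier.pop()
        if s in seen:
            continue
        seen.add(s)
        frontier.extend(target for _, target in TRANSITIONS.get(s, []))
    return seen

def always_recoverable(init):
    """A fault-tolerance-style property: from every reachable state,
    the nominal state 'ok' can be reached again."""
    return all("ok" in reachable(s) for s in reachable(init))
```

Dropping the `recover` transition makes the property fail, mirroring how a fault scenario outside the characterising formula would be rejected.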
Second-Order Type Isomorphisms Through Game Semantics
The characterization of second-order type isomorphisms is a purely
syntactical problem that we propose to study under the enlightenment of game
semantics. We study this question in the case of second-order
λµ-calculus, which can be seen as an extension of system F to
classical logic, and for which we define a categorical framework: control
hyperdoctrines. Our game model of λµ-calculus is based on polymorphic
arenas (closely related to Hughes' hyperforests) which evolve during the play
(following the ideas of Murawski-Ong). We show that type isomorphisms coincide
with the "equality" on arenas associated with types. Finally we deduce the
equational characterization of type isomorphisms from this equality. We also
recover from the same model Roberto Di Cosmo's characterization of type
isomorphisms for system F. This approach leads to a geometrical comprehension
on the question of second order type isomorphisms, which can be easily extended
to some other polymorphic calculi including additional programming features. Comment: accepted by Annals of Pure and Applied Logic, Special Issue on Game Semantics
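Di Cosmo's equational characterisation mentioned at the end of the abstract can be made concrete for the first-order fragment (products and arrows, no quantifiers): two such types are isomorphic exactly when they are equated by commutativity and associativity of products, currying, and distribution of arrows over products. A sketch of that decision procedure in Python (the representation and names are ours; the second-order case treated in the paper involves further axioms for quantifiers not handled here):

```python
def canon(t):
    """Canonical form of a type: a sorted tuple of 'components', where a
    component (args, head) stands for the arrow type args -> head.
    Types are tuples: ('atom', name), ('prod', s, t), ('arrow', s, t)."""
    op = t[0]
    if op == 'atom':
        return (((), t[1]),)
    if op == 'prod':
        # A x B ~ B x A and associativity: flatten and sort components.
        return tuple(sorted(canon(t[1]) + canon(t[2])))
    # arrow: curry (A x B) -> C into A -> B -> C, and distribute
    # A -> (B x C) into (A -> B) x (A -> C).
    dom = canon(t[1])
    return tuple(sorted((tuple(sorted(dom + args)), head)
                        for args, head in canon(t[2])))

def iso(s, t):
    """Two types are isomorphic iff their canonical forms coincide."""
    return canon(s) == canon(t)
```

For instance, `a x b -> c` and `a -> (b -> c)` receive the same canonical form, while `a -> b` and `b -> a` do not.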
Q# as a Quantum Algorithmic Language
Q# is a standalone domain-specific programming language from Microsoft for
writing and running quantum programs. Like most industrial languages, it was
designed without a formal specification, which can naturally lead to ambiguity
in its interpretation. We aim to provide a formal language definition for Q#,
placing the language on a solid mathematical foundation and enabling further
evolution of its design and type system. This paper presents λ-Q#, an
idealized version of Q# that illustrates how we may view Q# as a quantum Algol
(algorithmic language). We show the safety properties enforced by
λ-Q#'s type system and present its equational semantics based on a
fully complete algebraic theory by Staton. Comment: In Proceedings QPL 2022, arXiv:2311.0837
Term rewriting systems from Church-Rosser to Knuth-Bendix and beyond
Term rewriting systems are important for computability theory of abstract data types, for automatic theorem proving, and for the foundations of functional programming. In this short survey we present, starting from first principles, several of the basic notions and facts in the area of term rewriting. Our treatment, which often will be informal, covers abstract rewriting, Combinatory Logic, orthogonal systems, strategies, critical pair completion, and some extended rewriting formats.
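Several of the notions the survey lists - abstract rewriting, Combinatory Logic, reduction strategies - can be seen at work in a few lines. Below is a small illustrative rewriter for S/K combinator terms under the leftmost-outermost strategy; the term encoding (strings for constants and variables, pairs for application) is our own, not the survey's.

```python
def step(t):
    """One leftmost-outermost rewrite step; None if t is a normal form.
    Rules: (K x) y -> x   and   ((S x) y) z -> (x z) (y z)."""
    if isinstance(t, tuple):
        f, a = t
        if isinstance(f, tuple) and f[0] == 'K':       # (K x) y -> x
            return f[1]
        if isinstance(f, tuple) and isinstance(f[0], tuple) and f[0][0] == 'S':
            x, y, z = f[0][1], f[1], a                 # ((S x) y) z -> (x z) (y z)
            return ((x, z), (y, z))
        r = step(f)                                    # otherwise rewrite leftmost ...
        if r is not None:
            return (r, a)
        r = step(a)                                    # ... then rightmost subterm
        if r is not None:
            return (f, r)
    return None

def normalize(t, limit=1000):
    """Iterate step until a normal form is reached (bounded, since
    not every combinator term is normalizing)."""
    for _ in range(limit):
        r = step(t)
        if r is None:
            return t
        t = r
    raise RuntimeError("no normal form within step limit")
```

For example, the identity combinator I = S K K satisfies `normalize(((('S','K'),'K'), 'x')) == 'x'`. Leftmost-outermost is the classic normalizing strategy for such orthogonal systems: it finds a normal form whenever one exists.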
- …