From coinductive proofs to exact real arithmetic: theory and applications
Based on a new coinductive characterization of continuous functions, we
extract certified programs for exact real number computation from constructive
proofs. The extracted programs construct and combine exact real number
algorithms with respect to the binary signed digit representation of real
numbers. The data type corresponding to the coinductive definition of
continuous functions consists of finitely branching non-wellfounded trees
describing when the algorithm writes and reads digits. We discuss several
examples including the extraction of programs for polynomials up to degree two
and the definite integral of continuous maps
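The signed digit representation mentioned above encodes a real number in [-1, 1] as a stream of digits from {-1, 0, 1}, with x = Σ dᵢ·2⁻ⁱ. A minimal Python sketch of the idea (illustrative only; the paper's extracted programs are produced from proofs, and the names `approx` and `negate` here are assumptions, not the paper's code):

```python
from fractions import Fraction

def approx(digits):
    """Rational value of a finite prefix [d1, d2, ...] of signed binary
    digits d_i in {-1, 0, 1}, encoding x = sum_i d_i * 2^(-i)."""
    return sum(Fraction(d, 2 ** i) for i, d in enumerate(digits, start=1))

def negate(stream):
    """Negation acts digitwise on the signed digit representation."""
    for d in stream:
        yield -d

# 1/2 is encoded by the digit stream 1, 0, 0, ...; negating it gives -1/2.
half = [1, 0, 0, 0]
assert approx(half) == Fraction(1, 2)
assert approx(list(negate(half))) == Fraction(-1, 2)
```

The redundancy of the representation (several streams may denote the same real) is what makes algorithms such as averaging and integration productive: an algorithm can always emit an output digit after reading only finitely many input digits, which is exactly what the non-wellfounded read/write trees record.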
Programming Languages and Systems
This open access book constitutes the proceedings of the 29th European Symposium on Programming, ESOP 2020, which was planned to take place in Dublin, Ireland, in April 2020, as part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2020. The actual ETAPS 2020 meeting was postponed due to the COVID-19 pandemic. The papers deal with fundamental issues in the specification, design, analysis, and implementation of programming languages and systems
Arrows for knowledge-based circuits
Knowledge-based programs (KBPs) are a formalism for directly relating agents' knowledge and behaviour in a way that has proven useful for specifying distributed systems. Here we present a scheme for compiling KBPs to executable automata in finite environments, with a proof of correctness in Isabelle/HOL. We use Arrows, a functional programming abstraction, to structure a prototype domain-specific synchronous language embedded in Haskell. By adapting our compilation scheme to use symbolic representations, we can apply it to several examples of reasonable size
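Arrows generalize function composition to computations with extra structure. A minimal arrow-like wrapper can be sketched in Python (loosely modelled on Haskell's `Control.Arrow`; this is an illustrative toy, not the paper's Haskell DSL):

```python
class Arrow:
    """A minimal arrow-like wrapper over plain functions."""
    def __init__(self, f):
        self.f = f

    def __call__(self, x):
        return self.f(x)

    def __rshift__(self, other):
        # Sequential composition, like Haskell's (>>>).
        return Arrow(lambda x: other.f(self.f(x)))

    def first(self):
        # Apply the computation to the first component of a pair,
        # threading the second component through unchanged.
        return Arrow(lambda p: (self.f(p[0]), p[1]))

double = Arrow(lambda x: 2 * x)
inc = Arrow(lambda x: x + 1)
assert (double >> inc)(5) == 11
assert double.first()((5, "env")) == (10, "env")
```

The `first` combinator is what lets arrow-structured circuits thread an environment (here, an agent's state) alongside the main computation, which is the shape exploited when compiling KBPs to automata.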
Emerging trends proceedings of the 17th International Conference on Theorem Proving in Higher Order Logics: TPHOLs 2004
This volume constitutes the proceedings of the Emerging Trends track of the 17th International Conference on Theorem Proving in Higher Order Logics (TPHOLs 2004), held September 14-17, 2004 in Park City, Utah, USA. The TPHOLs conference covers all aspects of theorem proving in higher order logics as well as related topics in theorem proving and verification. There were 42 papers submitted to TPHOLs 2004 in the full research category, each of which was refereed by at least 3 reviewers selected by the program committee. Of these submissions, 21 were accepted for presentation at the conference and publication in volume 3223 of Springer's Lecture Notes in Computer Science series. In keeping with longstanding tradition, TPHOLs 2004 also offered a venue for the presentation of work in progress, where researchers invite discussion by means of a brief introductory talk and then discuss their work at a poster session. The work-in-progress papers are held in this volume, which is published as a 2004 technical report of the School of Computing at the University of Utah
The use of proof plans in tactic synthesis
We undertake a programme of tactic synthesis. We first formalize the notion of
a tactic as a rewrite rule, then give a correctness criterion for this by means of a
reflection mechanism in the constructive type theory OYSTER. We further formalize
the notion of a tactic specification, given as a synthesis goal and a decidability
goal. We use a proof planner, CLAM, to guide the search for inductive proofs
of these, and are able to successfully synthesize several tactics in this fashion.
This involves two extensions to existing methods: context-sensitive rewriting and
higher-order wave rules. Further, we show that from a proof of the decidability
goal one may compile a pseudo-tactic to a Prolog program, which may be run to
efficiently simulate the input/output behaviour of the synthesized tactic
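The core notion of a tactic as a rewrite rule can be rendered as a tiny Python toy (terms as nested tuples; the rule name and encoding are illustrative assumptions, not OYSTER/CLAM syntax):

```python
# A tactic-as-rewrite-rule either rewrites a goal term or leaves it
# unchanged, so failed application is harmless and rules compose.
def rule_add_zero(term):
    """Rewrite x + 0 to x on tuple-encoded terms ('+', lhs, rhs)."""
    if isinstance(term, tuple) and len(term) == 3 \
            and term[0] == "+" and term[2] == 0:
        return term[1]
    return term

assert rule_add_zero(("+", "x", 0)) == "x"
assert rule_add_zero(("+", "x", 1)) == ("+", "x", 1)  # rule does not apply
```

A pseudo-tactic in this picture is just the replay of such rule applications, which is why it can simulate the input/output behaviour of the full tactic without re-running proof search.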
Formalizing graphical notations
The thesis describes research into graphical notations for software engineering, with a principal interest in ways of formalizing them. The research seeks to provide a theoretical basis that will help in designing both notations and the software tools that process them.
The work starts from a survey of literature on notation, followed by a review of techniques for formal description and for computational handling of notations. The survey concentrates on collecting views of the benefits and the problems attending notation use in software development; the review covers picture description languages, grammars and tools such as generic editors and visual programming environments. The main problem of notation is found to be the lack of any coherent, rigorous description method. Current approaches to this problem lack consensus on syntax specification and a clear focus on a defined concept of notated expression.
To address these deficiencies, the thesis embarks upon an exploration of semiotic, linguistic and logical theory; this culminates in a proposed formalization of semiosis in notations, using categorial model theory as a mathematical foundation. An argument about the structure of sign systems leads to an analysis of notation into a layered system of tractable theories, spanning the gap between expressive pictorial medium and subject domain. This notion of 'tectonic' theory aims to treat both diagrams and formulae together.
The research gives details of how syntactic structure can be sketched in a mathematical sense, with examples applying to software development diagrams, offering a new solution to the problem of notation specification. Based on these methods, the thesis discusses directions for resolving the harder problems of supporting notation design, processing and computer-aided generic editing. A number of future research areas are thereby opened up. For practical trial of the ideas, the work proceeds to the development and partial implementation of a system to aid the design of notations and editors. Finally the thesis is evaluated as a contribution to theory in an area which has not attracted a standard approach
Mechanizing Abstract Interpretation
It is important when developing software to verify the absence of undesirable
behavior such as crashes, bugs and security vulnerabilities. Some settings
require high assurance in verification results, e.g., for embedded software in
automobiles or airplanes. To achieve high assurance in these verification
results, formal methods are used to automatically construct or check proofs of
their correctness. However, achieving high assurance for program analysis
results is challenging, and current methods are ill-suited for both complex
critical domains and mainstream use.
To verify the correctness of software we consider program analyzers---automated
tools which detect software defects---and to achieve high assurance in
verification results we consider mechanized verification---a rigorous process
for establishing the correctness of program analyzers via computer-checked
proofs.
The key challenges to designing verified program analyzers are: (1) achieving
an analyzer design for a given programming language and correctness property;
(2) achieving an implementation for the design; and (3) achieving a mechanized
verification that the implementation is correct w.r.t. the design. The state of
the art in (1) and (2) is to use abstract interpretation: a guiding
mathematical framework for systematically constructing analyzers directly from
programming language semantics. However, achieving (3) in the presence of
abstract interpretation has remained an open problem since the late 1990s.
Furthermore, even the state of the art which achieves (3) in the absence of
abstract interpretation cannot be reused in the presence of new analyzer
designs or programming language features.
First, we solve the open problem which has prevented the combination of
abstract interpretation (and in particular, calculational abstract
interpretation) with mechanized verification, which advances the state of the
art in designing, implementing, and verifying analyzers for critical software.
We do this through a new mathematical framework, Constructive Galois
Connections, which supports synthesizing specifications for program analyzers
and calculating implementations from these induced specifications, and which
is amenable to mechanized verification.
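The Galois-connection idea at the heart of abstract interpretation can be illustrated with a toy sign abstraction (a standard textbook example, not the dissertation's constructive framework; the names `alpha` and `abstract_add` are assumptions for illustration):

```python
# alpha maps a set of integers to its best description in the finite
# lattice {BOT, NEG, ZERO, POS, TOP}; abstract_add soundly
# over-approximates integer addition on these descriptions.
BOT, NEG, ZERO, POS, TOP = "bot", "neg", "zero", "pos", "top"

def alpha(s):
    """Abstraction: the least sign describing every integer in s."""
    if not s:
        return BOT
    signs = {NEG if n < 0 else ZERO if n == 0 else POS for n in s}
    return next(iter(signs)) if len(signs) == 1 else TOP

def abstract_add(a, b):
    """Sound abstract addition: covers {x + y | x in A, y in B}."""
    if BOT in (a, b):
        return BOT
    if a == ZERO:
        return b
    if b == ZERO:
        return a
    if a == b:
        return a   # neg + neg = neg, pos + pos = pos
    return TOP     # mixed signs: the sum's sign is unknown

# Soundness on a sample: the abstract sum describes every concrete sum.
xs, ys = {1, 4}, {2, 3}
assert abstract_add(alpha(xs), alpha(ys)) == POS
assert alpha({x + y for x in xs for y in ys}) == POS
```

The concretization direction (which set of integers a sign denotes) is the part that is classically non-constructive for infinite sets; replacing it with an extraction function is, roughly, the move that constructive Galois connections make to recover mechanizability.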
Finally, we introduce reusable components for implementing analyzers for a wide
range of designs and semantics. We do this through two new frameworks, Galois
Transformers and Definitional Abstract Interpreters. These frameworks tightly
couple analyzer design decisions, implementation fragments, and verification
properties into compositional components which are (target)
programming-language independent and amenable to mechanized verification.
Variations in the analysis design are then recovered by simply re-assembling
the combination of components. Using this framework, sophisticated program
analyzers can be assembled by non-experts, and the results are guaranteed to be
verified by construction
Program extraction from coinductive proofs and its application to exact real arithmetic
Program extraction originated in the field of constructive mathematics,
and nowadays it attracts interest not only from mathematicians but also
from computer scientists. From a mathematical viewpoint, its aim is to
uncover the computational meaning of proofs; from a computer-science
viewpoint, its aim is to study methods for obtaining correct programs.
It is therefore natural to have both theoretical results and a practical
computer system for developing executable programs via program extraction.
In this Thesis we study the computational interpretation of
constructive proofs involving inductive and coinductive reasoning. We
interpret proofs by translating the computational content of proofs
into executable program code. This translation is the procedure we
call program extraction and it is given through Kreisel's modified
realizability. Here we study a proof-theoretic foundation for program
extraction, enriching the proof assistant system Minlog based on this
theoretical improvement. Once a proof of a formula is written in
Minlog, a program can be extracted from the proof by the system
itself, and the extracted program can be executed in Minlog.
Moreover, extracted programs are provably correct with respect to the
proved formula, due to a soundness theorem which we establish. We put
program extraction into practice by elaborating case studies from exact real
arithmetic within our formal theory. Although these case studies have
been studied elsewhere, here we offer a formalization of them in
Minlog, and also machine-extraction of the corresponding programs
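The essence of extraction via realizability can be conveyed with a deliberately tiny Python sketch (for intuition only; actual extraction happens inside Minlog, and the function name here is an illustrative assumption):

```python
# Modified realizability, informally: a constructive proof of
# "for every n there exists m with m > n" contains a witness, and
# program extraction turns that witness into a program. The soundness
# theorem guarantees the program satisfies the proved formula.
def extracted_witness(n):
    # The constructive proof "take m = n + 1" yields this program.
    return n + 1

# The extracted program satisfies the formula it was extracted from.
for n in range(1000):
    assert extracted_witness(n) > n
```

The same mechanism scales to the coinductive setting: a proof that a function on reals is continuous realizes a stream-transforming program on signed digit representations, which is what the case studies exercise.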
Logical foundations of database transformations for databases with complex types
Database transformations consist of queries and updates, the two fundamental kinds of computation in any database: the first provides the capability to retrieve data, and the second is used to maintain databases in light of ever-changing application domains. With the rising popularity of web-based applications and service-oriented architectures, the development of database transformations must address new challenges, which frequently call for a theoretical framework that unifies both queries and updates over complex-value databases. This dissertation aims to lay down the foundations of such a theoretical framework in the context of complex-value databases.
We use an approach that has successfully been applied to the characterisation of sequential algorithms. The sequential Abstract State Machine (ASM) thesis captures the semantics and behaviour of sequential algorithms. This dissertation exploits the similarity between general computations and database transformations to characterise the latter by five postulates: the sequential time postulate, the abstract state postulate, the bounded exploration postulate, the background postulate, and the bounded non-determinism postulate. The last two postulates reflect the specific form of transformations for databases. The five postulates exactly capture database transformations. Furthermore, we provide a logical proof system for database transformations that is sound and complete
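The ASM-style step semantics behind the postulates can be sketched in a few lines of Python (a toy illustration of one sequential-time step via an update set, applied atomically; the relation name and rule are assumptions, not the dissertation's formalism):

```python
# One step of an ASM-like database transformation: inspect the current
# state (a binary relation R), collect an update set, and apply it
# atomically, as the sequential time postulate requires.
def step(state):
    """Close the relation R under symmetry in one atomic step."""
    updates = {("R", (y, x))
               for (x, y) in state["R"]
               if (y, x) not in state["R"]}
    new_r = set(state["R"]) | {loc for (_, loc) in updates}
    return {"R": new_r}

s0 = {"R": {(1, 2), (3, 3)}}
s1 = step(s0)
assert s1["R"] == {(1, 2), (2, 1), (3, 3)}
assert step(s1) == s1  # fixpoint: the update set is now empty
```

Separating "compute the update set" from "apply it" is the design point the postulates axiomatize: the step's effect is fully determined by a bounded inspection of the state, and updates take effect only between steps.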