A foundation for synthesising programming language semantics
Programming or scripting languages used in real-world systems are seldom designed
with a formal semantics in mind from the outset. Therefore, the first step in developing well-founded analysis tools for these systems is to reverse-engineer a formal
semantics. This can take months or years of effort.
Could we automate this process, at least partially? Though desirable, automatically reverse-engineering semantic rules from an implementation is very challenging,
as found by Krishnamurthi, Lerner and Elberty. They propose automatically learning
desugaring translation rules, which map the language whose semantics we seek onto a simplified core version whose semantics are much easier to write. The present thesis
contains an analysis of their challenge, as well as the first steps towards a solution.
Scaling methods with the size of the language is very difficult due to state-space
explosion, so this thesis proposes an incremental approach to learning the translation
rules. I present a formalisation that both clarifies the informal description of the challenge by Krishnamurthi et al. and re-formulates the problem, shifting the focus to the
conditions for incremental learning. The central definition of the new formalisation is
the desugaring extension problem, i.e. extending a set of established translation rules
by synthesising new ones.
In a synthesis algorithm, the choice of search space is important and non-trivial,
as it needs to strike a good balance between expressiveness and efficiency. The rest
of the thesis focuses on defining search spaces for translation rules via typing rules.
Comparing search spaces requires two prerequisites. The first is a series of
benchmarks: a set of source and target languages equipped with intended translation
rules between them. The second is an enumerative synthesis algorithm for efficiently
enumerating typed programs. I show how algebraic enumeration techniques can be applied to enumerating well-typed translation rules, and discuss the properties a type
system must have for its typed programs to be efficiently enumerable.
The thesis presents and empirically evaluates two search spaces. A baseline search
space yields the first practical solution to the challenge. The second search space is
based on a natural heuristic for translation rules, restricting variables so that
each is used exactly once. I present a linear type system, designed for efficient enumeration of translation rules, in which this heuristic is enforced. Through informal analysis
and empirical comparison to the baseline, I then show that using linear types can speed
up the synthesis of translation rules by an order of magnitude.
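To make the notion of a desugaring translation rule concrete, the following is a minimal illustrative sketch (not taken from the thesis; the AST and rule are hypothetical): the surface construct `let x = e1 in e2` is translated into the core lambda calculus as `(lambda x. e2) e1`. Note that the rule uses each metavariable on its right-hand side exactly once, matching the linearity heuristic described above.

```python
# Illustrative sketch: a desugaring translation rule mapping a surface
# construct (Let) into a core lambda calculus (Var/Lam/App).
# All names here are hypothetical, chosen for the example.
from dataclasses import dataclass

@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class Lam:
    param: str
    body: object

@dataclass(frozen=True)
class App:
    fn: object
    arg: object

@dataclass(frozen=True)
class Let:  # surface-only construct, absent from the core language
    name: str
    bound: object
    body: object

def desugar(term):
    """Recursively translate surface terms into core terms."""
    if isinstance(term, Let):
        # The translation rule: Let(x, e1, e2) => App(Lam(x, e2), e1).
        return App(Lam(term.name, desugar(term.body)), desugar(term.bound))
    if isinstance(term, Lam):
        return Lam(term.param, desugar(term.body))
    if isinstance(term, App):
        return App(desugar(term.fn), desugar(term.arg))
    return term  # variables pass through unchanged

# let x = one in f x  ==>  (lambda x. f x) one
core = desugar(Let("x", Var("one"), App(Var("f"), Var("x"))))
```

A synthesiser in this setting searches over candidate right-hand sides such as `App(Lam(x, e2), e1)`; a type system prunes candidates whose holes cannot be filled consistently.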
Some Aspects of the Theology of the City in ANE Literature and Biblical Protology and Eschatology: A Comparative Study
The city is an essential accomplishment that is embedded in the foundations of human civilization. From its mature appearance in Sumer and its developed forms throughout the ANE world, the city held a high place in cosmology, cosmogony, and anthropogony. The ideology and theology of the city created by the ANE peoples were built around and presented through the interplay of the triangle of influences and dependencies formed by the city, the temple, and kingship in conjunction with the gods. The question is whether the same construct is ingeminated in the Bible. This dissertation strives to provide an appropriate context in order to critically assess the relatedness between the ANE and biblical views on the city, specifically from the perspective of the biblical protology (Genesis 1–11) and eschatology (Revelation 21–22). It also aims to understand the biblical attitudes towards the city, their coordination and complementarity in addressing the ANE views, their conceptual direction, as well as their theoretical and practical consequences.
Shoggoth: A Formal Foundation for Strategic Rewriting
Rewriting is a versatile and powerful technique used in many domains. Strategic rewriting allows programmers to control the application of rewrite rules by composing individual rewrite rules into complex rewrite strategies. These strategies are semantically complex, as they may be nondeterministic, they may raise errors that trigger backtracking, and they may not terminate.

Given such semantic complexity, it is necessary to establish a formal understanding of rewrite strategies and to enable reasoning about them in order to answer questions like: How do we know that a rewrite strategy terminates? How do we know that a rewrite strategy does not fail because we compose two incompatible rewrites? How do we know that a desired property holds after applying a rewrite strategy?

In this paper, we introduce Shoggoth: a formal foundation for understanding, analysing and reasoning about strategic rewriting that is capable of answering these questions. We provide a denotational semantics of System S, a core language for strategic rewriting, and prove its equivalence to our big-step operational semantics, which extends existing work by explicitly accounting for divergence. We further define a location-based weakest precondition calculus to enable formal reasoning about rewriting strategies, and we prove this calculus sound with respect to the denotational semantics. We show how this calculus can be used in practice to reason about properties of rewriting strategies, including termination, that they are well-composed, and that desired postconditions hold. The semantics and calculus are formalised in Isabelle/HOL and all proofs are mechanised.
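The strategy combinators the abstract alludes to can be sketched as follows. This is an illustrative toy in the spirit of System S, not Shoggoth's formalisation: a strategy is a function from a term to a rewritten term, with `None` signalling failure, and failure of the left branch of a choice triggers backtracking to the right branch.

```python
# Hypothetical sketch of strategic-rewriting combinators in the spirit of
# System S; names and term representation are illustrative only.
# A strategy maps a term to a rewritten term, or to None on failure.

def ident(t):
    return t  # always succeeds, leaving the term unchanged

def seq(s1, s2):
    """Apply s1, then s2; fail if either fails."""
    def go(t):
        r = s1(t)
        return None if r is None else s2(r)
    return go

def choice(s1, s2):
    """Left-biased choice: failure of s1 backtracks to s2."""
    def go(t):
        r = s1(t)
        return r if r is not None else s2(t)
    return go

def try_(s):
    """Never fails: fall back to the identity on failure."""
    return choice(s, ident)

def repeat(s):
    """Apply s until it fails. May diverge if s always succeeds --
    exactly the termination question the paper's calculus addresses."""
    def go(t):
        r = s(t)
        return t if r is None else go(r)
    return go

# An example rewrite rule on tuple-encoded terms: x + 0 -> x.
def add_zero(t):
    if isinstance(t, tuple) and t[0] == 'add' and t[2] == 0:
        return t[1]
    return None

simplify = repeat(add_zero)
```

For instance, `simplify(('add', ('add', 'x', 0), 0))` strips both additions of zero. The `repeat` combinator shows why divergence must be modelled explicitly: nothing in its definition guarantees termination.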
Formalizing, Verifying and Applying ISA Security Guarantees as Universal Contracts
Progress has recently been made on specifying instruction set architectures
(ISAs) in executable formalisms rather than through prose. However, to date,
those formal specifications are limited to the functional aspects of the ISA
and do not cover its security guarantees. We present a novel, general method
for formally specifying an ISA's security guarantees that (1) balances the needs of
ISA implementations (hardware) and clients (software), (2) can be
semi-automatically verified to hold for the ISA operational semantics,
producing a high-assurance mechanically-verifiable proof, and (3) supports
informal and formal reasoning about security-critical software in the presence
of adversarial code. Our method leverages universal contracts: software
contracts that express bounds on the authority of arbitrary untrusted code.
Universal contracts can be kept agnostic of software abstractions, and strike
the right balance between requiring sufficient detail for reasoning about
software and preserving implementation freedom of ISA designers and CPU
implementers. We semi-automatically verify universal contracts against Sail
implementations of ISA semantics using our Katamaran tool, a semi-automatic
separation-logic verifier for Sail that produces machine-checked proofs for
successfully verified contracts. We demonstrate the generality of our method by
applying it to two ISAs that offer very different security primitives: (1)
MinimalCaps: a custom-built capability machine ISA and (2) a (somewhat
simplified) version of RISC-V with PMP. We verify a femtokernel using the
security guarantee we have formalized for RISC-V with PMP.
Morpheus: Automated Safety Verification of Data-Dependent Parser Combinator Programs
Parser combinators are a well-known mechanism for the compositional construction of parsers, and have been shown to be particularly useful in writing parsers for rich grammars with data-dependencies and global state. Verifying applications written using them, however, has proven to be challenging, in large part because of the inherently effectful nature of the parsers being composed and the difficulty of reasoning about the arbitrarily rich data-dependent semantic actions that can be associated with parsing actions. In this paper, we address these challenges by defining a parser combinator framework called Morpheus, equipped with abstractions for defining composable effects tailored for parsing and semantic actions, and a rich specification language used to define safety properties over the constituent parsers comprising a program. Even though its abstractions yield many of the same expressivity benefits as other parser combinator systems, Morpheus is carefully engineered to yield a substantially more tractable automated verification pathway. We demonstrate its utility in verifying a number of realistic, challenging parsing applications, including several cases that involve non-trivial data-dependent relations.
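The data-dependency at the heart of this abstract can be illustrated with a minimal sketch (not Morpheus itself; names and representation are hypothetical): a monadic `bind` lets the result of one parse choose the next parser, as in a length-prefixed field where a leading digit determines how many payload characters to consume.

```python
# Illustrative sketch of a data-dependent parser combinator.
# A parser maps (input, pos) to (value, new_pos), or None on failure.

def digit(s, i):
    """Parse a single decimal digit as an int."""
    if i < len(s) and s[i].isdigit():
        return int(s[i]), i + 1
    return None

def bind(p, f):
    """Run p, then feed its result to f to choose the next parser --
    this is where the data-dependency lives."""
    def go(s, i):
        r = p(s, i)
        if r is None:
            return None
        v, j = r
        return f(v)(s, j)
    return go

def take(n):
    """Consume exactly n characters."""
    def go(s, i):
        return (s[i:i + n], i + n) if i + n <= len(s) else None
    return go

# A length-prefixed field: one digit n, then exactly n payload characters.
length_prefixed = bind(digit, take)
```

For example, `length_prefixed("3abcxyz", 0)` yields `("abc", 4)`. A safety property one might want verified here is that the parser never reads past the end of the input, which the bounds check in `take` enforces.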
"Le présent est plein de l'avenir, et chargé du passé": Lectures of the XI International Leibniz Congress, 31 July – 4 August 2023, Leibniz Universität Hannover, Germany. Volume 2
[No abstract available] Funding: Deutsche Forschungsgemeinschaft (DFG), Project no. 517991912; VGH Versicherung; Niedersächsisches Ministerium für Wissenschaft und Kultur (MWK)
"Le présent est plein de l'avenir, et chargé du passé": Lectures of the XI International Leibniz Congress, 31 July – 4 August 2023, Leibniz Universität Hannover, Germany. Volume 3
[No abstract available] Funding: Deutsche Forschungsgemeinschaft (DFG), Project no. 517991912; VGH Versicherung; Niedersächsisches Ministerium für Wissenschaft und Kultur (MWK)
- …