Types for Information Flow Control: Labeling Granularity and Semantic Models
Language-based information flow control (IFC) tracks dependencies within a
program using sensitivity labels and prohibits public outputs from depending on
secret inputs. In particular, the literature has proposed several type systems for
tracking these dependencies. On one extreme, there are fine-grained type
systems (like Flow Caml) that label all values individually and track
dependence at the level of individual values. On the other extreme are
coarse-grained type systems (like HLIO) that track dependence coarsely, by
associating a single label with an entire computation context and not labeling
all values individually.
In this paper, we show that, despite their glaring differences, both these
styles are, in fact, equally expressive. To do this, we show a semantics- and
type-preserving translation from a coarse-grained type system to a fine-grained
one and vice versa. The forward translation isn't surprising, but the backward
translation is: It requires a construct to arbitrarily limit the scope of a
context label in the coarse-grained type system (e.g., HLIO's "toLabeled"
construct). As a separate contribution, we show how to extend work on logical
relation models of IFC types to higher-order state. We build such logical
relations for both the fine-grained type system and the coarse-grained type
system. We use these relations to prove the two type systems and our
translations between them sound.
Comment: 31st IEEE Symposium on Computer Security Foundations (CSF 2018)
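The two labeling disciplines contrasted in the abstract above can be sketched in a few lines. This is a minimal illustration over a hypothetical two-point lattice, not Flow Caml's or HLIO's actual API: in the fine-grained style every value carries a label, while in the coarse-grained style unlabeling a value raises a single context label instead of tagging the result.

```haskell
-- Hypothetical two-point label lattice with Public below Secret.
data Label = Public | Secret deriving (Eq, Ord, Show)

joinL :: Label -> Label -> Label
joinL = max

-- Fine-grained style: every value carries its own label, and
-- operations join the labels of their inputs.
data Labeled a = Labeled Label a deriving (Eq, Show)

addLabeled :: Labeled Int -> Labeled Int -> Labeled Int
addLabeled (Labeled l1 x) (Labeled l2 y) = Labeled (joinL l1 l2) (x + y)

-- Coarse-grained style: one label for the whole computation
-- context; unlabeling folds a value's label into the context
-- rather than into the result.
unlabel :: Label -> Labeled a -> (Label, a)
unlabel ctx (Labeled l x) = (joinL ctx l, x)
```

The paper's backward translation hinges on being able to restore a smaller context label after such a raise, which is what a construct like HLIO's "toLabeled" provides.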
Data-flow analyses as effects and graded monads
In static analysis, two frameworks have been studied extensively: monotone data-flow analysis and type-and-effect systems. Whilst both are seen as general analysis frameworks, their relationship has remained unclear. Here we show that monotone data-flow analyses can be encoded as effect systems in a uniform way, via algebras of transfer functions. This helps to answer questions about the most appropriate structure for general effect algebras, especially with regard to capturing control flow precisely. Via the perspective of capturing data-flow analyses, we show that the recent suggestion of using effect quantales is not general enough, as it excludes non-distributive analyses, e.g., constant propagation. By rephrasing the McCarthy transformation, we then model monotone data-flow effects via graded monads. This provides a model of data-flow analyses that can be used to reason about analysis correctness at the semantic level, and to embed data-flow analyses into type systems.
Trinity College, Cambridge (Internal Graduate Scholarship)
EPSRC grant EP/T013516/
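The non-distributivity of constant propagation mentioned above can be seen concretely with transfer functions over a flat lattice. A rough sketch, not the paper's formalism: sequencing is function composition, branching is pointwise join, and joining the two branch states before an addition loses a constant that joining after the addition would keep.

```haskell
-- Flat constant-propagation lattice for one variable's value.
data CP = Bot | Const Int | Top deriving (Eq, Show)

lub :: CP -> CP -> CP
lub Bot y = y
lub x Bot = x
lub (Const a) (Const b) | a == b = Const a
lub _ _ = Top

-- Abstract addition on the flat lattice.
plusCP :: CP -> CP -> CP
plusCP Bot _ = Bot
plusCP _ Bot = Bot
plusCP (Const a) (Const b) = Const (a + b)
plusCP _ _ = Top

-- Abstract state: the values of two variables x and y.
type State = (CP, CP)

-- Transfer functions as an effect algebra: sequencing is
-- composition, branching is pointwise join.
type Transfer = State -> State

seqT, altT :: Transfer -> Transfer -> Transfer
seqT f g = g . f
altT f g s =
  let (x1, y1) = f s
      (x2, y2) = g s
  in (lub x1 x2, lub y1 y2)

-- Two branches assigning (x, y) differently, then x := x + y.
branch1, branch2, addXY :: Transfer
branch1 _ = (Const 1, Const 2)
branch2 _ = (Const 2, Const 1)
addXY (x, y) = (plusCP x y, y)
```

Running `addXY` after joining the branches yields `Top` for x, whereas joining the two sequenced paths yields `Const 3`: distributivity of sequencing over join fails, which is why an effect algebra that assumes it (such as an effect quantale) cannot capture this analysis.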
Linear Haskell: practical linearity in a higher-order polymorphic language
Linear type systems have a long and storied history, but no clear path forward
to integration with existing languages such as OCaml or Haskell. In this
paper, we study a linear type system designed with two crucial properties in
mind: backwards-compatibility and code reuse across linear and non-linear users
of a library. Only then can the benefits of linear types permeate conventional
functional programming. Rather than bifurcate types into linear and non-linear
counterparts, we instead attach linearity to function arrows. Linear functions
can receive inputs from linearly-bound values, but can also operate over
unrestricted, regular values.
To demonstrate the efficacy of our linear type system - both how easily it can
be integrated into an existing language implementation and how much it
streamlines writing programs with linear types - we implemented our type system
in GHC, the leading Haskell compiler, and demonstrate two kinds of applications
of linear types: mutable data with pure interfaces; and enforcing protocols in
I/O-performing functions.
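The "linearity on the arrow" design described above is visible in GHC 9.0 and later with the LinearTypes extension. A small sketch: a linear arrow (written `%1 ->`) obliges the function to consume its argument exactly once, while the ordinary arrow places no such obligation.

```haskell
{-# LANGUAGE LinearTypes #-}

-- The linear arrow promises 'swap' consumes its argument exactly
-- once; pattern matching the pair and using each component once
-- satisfies this.
swap :: (a, b) %1 -> (b, a)
swap (x, y) = (y, x)

-- With an unrestricted arrow, duplicating the argument is fine;
-- the same definition would be rejected at type (a %1 -> (a, a)).
dup :: a -> (a, a)
dup x = (x, x)
```

Note that types are not bifurcated: the same `Int` or pair type flows through both linear and unrestricted functions, which is what enables code reuse between linear and non-linear clients.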
Relational Cost Analysis for Functional-Imperative Programs
Relational cost analysis aims at formally establishing bounds on the
difference in the evaluation costs of two programs. As a particular case, one
can also use relational cost analysis to establish bounds on the difference in
the evaluation cost of the same program on two different inputs. One way to
perform relational cost analysis is to use a relational type-and-effect system
that supports reasoning about relations between two executions of two programs.
Building on this basic idea, we present a type-and-effect system, called
ARel, for reasoning about the relative cost of array-manipulating, higher-order
functional-imperative programs. The key ingredient of our approach is a new
lightweight type refinement discipline that we use to track relations
(differences) between two arrays. This discipline combined with Hoare-style
triples built into the types allows us to express and establish precise
relative costs of several interesting programs which imperatively update their
data.
Comment: 14 pages
Cogent: uniqueness types and certifying compilation
This paper presents a framework aimed at significantly reducing the cost of proving functional correctness for low-level operating systems components. The framework is designed around a new functional programming language, Cogent. A central aspect of the language is its uniqueness type system, which eliminates the need for a trusted runtime or garbage collector while still guaranteeing memory safety, a crucial property for safety and security. Moreover, it allows us to assign two semantics to the language: the first semantics is imperative, suitable for efficient C code generation, and the second is purely functional, providing a user-friendly interface for equational reasoning and verification of higher-level correctness properties. The refinement theorem connecting the two semantics allows the compiler to produce a proof via translation validation certifying the correctness of the generated C code with respect to the semantics of the Cogent source program. We have demonstrated the effectiveness of our framework for implementation and for verification through two file system implementations.
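The two-semantics idea above has a familiar small-scale analogue in Haskell (this is an illustration of the principle, not Cogent code): the same array update can be given a purely functional semantics for reasoning and an imperative, in-place semantics for efficiency, and the two agree precisely because the mutable reference is used uniquely.

```haskell
import Data.Array
import Data.Array.ST (runSTArray, thaw, writeArray)

-- Purely functional semantics: update returns a fresh array,
-- convenient for equational reasoning.
setPure :: Array Int Int -> Int -> Int -> Array Int Int
setPure arr i v = arr // [(i, v)]

-- Imperative semantics: a destructive in-place write, safe to
-- expose as a pure function because the thawed copy is used
-- uniquely inside runSTArray.
setImp :: Array Int Int -> Int -> Int -> Array Int Int
setImp arr i v = runSTArray $ do
  m <- thaw arr
  writeArray m i v
  pure m
```

Cogent's uniqueness type system makes this kind of agreement a theorem of the language, with the compiler emitting a translation-validation proof connecting the generated C to the functional specification.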
Dynamic IFC Theorems for Free!
We show that noninterference and transparency, the key soundness theorems for
dynamic IFC libraries, can be obtained "for free", as direct consequences of
the more general parametricity theorem of type abstraction. This allows us to
give very short soundness proofs for dynamic IFC libraries such as faceted
values and LIO. Our proofs stay short even when fully mechanized for Agda
implementations of the libraries in terms of type abstraction.
Comment: CSF 2021 final version
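A toy version of the faceted values mentioned above (a sketch, not the paper's Agda development) shows the shape of the noninterference argument: computation proceeds on both facets at once, and the public facet never depends on the secret one.

```haskell
data Label = Public | Secret deriving (Eq, Show)

-- A faceted value holds one facet per observer level.
data Faceted a = Faceted { publicFacet :: a, secretFacet :: a }
  deriving (Eq, Show)

-- Projection: what an observer at a given level sees.
project :: Label -> Faceted a -> a
project Public = publicFacet
project Secret = secretFacet

-- Mapping acts on both facets pointwise, so the public facet of
-- the output depends only on the public facet of the input.
facetedMap :: (a -> b) -> Faceted a -> Faceted b
facetedMap f (Faceted p s) = Faceted (f p) (f s)
```

The paper's insight is that when the library keeps `Faceted` abstract, this independence of the public view from the secret facet need not be proved by hand: it falls out of the parametricity theorem for type abstraction.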
Polymonadic Programming
Monads are a popular tool for the working functional programmer to structure
effectful computations. This paper presents polymonads, a generalization of
monads. Polymonads give the familiar monadic bind the more general type forall
a,b. L a -> (a -> M b) -> N b, to compose computations with three different
kinds of effects, rather than just one. Polymonads subsume monads and
parameterized monads, and can express other constructions, including precise
type-and-effect systems and information flow tracking; more generally,
polymonads correspond to Tate's productoid semantic model. We show how to equip
a core language (called lambda-PM) with syntactic support for programming with
polymonads. Type inference and elaboration in lambda-PM allows programmers to
write polymonadic code directly in an ML-like syntax; our algorithms compute
principal types and produce elaborated programs wherein the binds appear
explicitly. Furthermore, we prove that the elaboration is coherent: no matter
which (type-correct) binds are chosen, the elaborated program's semantics will
be the same. Pleasingly, the inferred types are easy to read: the polymonad
laws justify (sometimes dramatic) simplifications, but with no effect on a
type's generality.
Comment: In Proceedings MSFP 2014, arXiv:1406.153
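The generalized bind type forall a,b. L a -> (a -> M b) -> N b can be rendered in Haskell with a multi-parameter type class. This is a hypothetical sketch, not the paper's lambda-PM calculus: three constructors that need not coincide, with ordinary monads recovered as the diagonal case.

```haskell
{-# LANGUAGE MultiParamTypeClasses, FlexibleInstances #-}
import Data.Functor.Identity

-- Polymonadic bind: the incoming computation (l), the
-- continuation's effect (m), and the result effect (n) may all
-- differ.
class PBind l m n where
  pbind :: l a -> (a -> m b) -> n b

-- Every ordinary monad gives the diagonal instance L = M = N.
instance PBind Maybe Maybe Maybe where
  pbind = (>>=)

-- A cross-constructor bind: pure Identity values feed into Maybe.
instance PBind Identity Maybe Maybe where
  pbind (Identity x) f = f x
```

In lambda-PM the programmer never writes `pbind` explicitly: elaboration inserts the binds, and the coherence theorem guarantees that any type-correct choice of instances yields the same semantics.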