12,076 research outputs found
On choice rules in dependent type theory
In a dependent type theory satisfying the propositions-as-types
correspondence together with the proofs-as-programs paradigm,
the validity of the unique choice rule, or a fortiori of the choice rule, says
that the extraction of a computable witness from an existential statement
under hypothesis can be performed within the same theory.
Here we show that the unique choice rule, and hence the choice rule,
is valid neither in Coquand's Calculus of Constructions with indexed
sum types, list types and binary disjoint sums, nor in its predicative
version implemented in the intensional level of the Minimalist Foundation.
This means that in these theories the extraction of computational
witnesses from existential statements must be performed in a more
expressive proofs-as-programs theory.
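The two rules in question can be rendered as follows in Lean 4 syntax (a sketch; the names and formulation here are ours, not the paper's). Both are stated as axioms precisely because the abstract's point is that they are not derivable in the theories under consideration:

```lean
-- Choice rule: extract a computable witness (a Σ'-pair living in Type)
-- from any existential statement (a proposition).
axiom choiceRule {α : Type} (p : α → Prop) :
    (∃ x, p x) → Σ' x, p x

-- Unique choice rule: the weaker rule, which only assumes the
-- witness is unique.
axiom uniqueChoiceRule {α : Type} (p : α → Prop) :
    (∃! x, p x) → Σ' x, p x
```

Since any proof of `∃! x, p x` yields a proof of `∃ x, p x`, validity of `choiceRule` would imply validity of `uniqueChoiceRule`; contrapositively, refuting unique choice refutes choice, which is the order of the argument in the abstract.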
An Abstract Interpretation for ML Equality Kinds
The definition of Standard ML provides a form of generic equality which is inferred for certain types, called equality types, on which it is possible to define an equality relation in ML. However, the standard definition is incomplete in the sense that there are interesting and useful types which are not inferred to be equality types but for which an equality relation can be defined in ML in a uniform manner. In this paper, a refinement of the Standard ML system of equality types is introduced and is proven sound and complete with respect to the existence of a definable equality. The technique used here is based on an abstract interpretation of ML operators as monotone functions over a three-point lattice. It is shown how the equality relation can be defined (as an ML program) from the definition of a type with our equality property. Finally, a sound, efficient algorithm for inferring the equality property, which corrects the limitations of the standard definition in all cases of practical interest, is demonstrated.
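The core idea of interpreting type constructors as monotone functions over a three-point lattice can be sketched as follows. This is an illustrative toy (the lattice elements, constructor set, and interpretations here are our assumptions, not the paper's exact definitions):

```python
# Toy abstract interpretation: each ML type constructor becomes a
# monotone function over the three-point chain EQ <= COND <= NOTEQ.
# (Illustrative domain; the paper's actual lattice may differ.)

EQ, COND, NOTEQ = 0, 1, 2  # equality definable / conditionally / never

def join(*xs):
    # least upper bound on the chain
    return max(xs)

# Interpretations of a few ML type constructors:
def t_int():            return EQ
def t_arrow(dom, cod):  return NOTEQ        # function types never admit equality
def t_prod(a, b):       return join(a, b)   # a pair is as bad as its worst component
def t_list(a):          return a            # lists inherit their element's kind
def t_ref(a):           return EQ           # ref cells compare by identity

# Example: int list * (int -> int) admits no definable equality.
print(t_prod(t_list(t_int()), t_arrow(t_int(), t_int())))  # 2 (NOTEQ)
```

The monotonicity of each interpretation is what makes a fixed-point computation over recursive type definitions converge, which is the usual payoff of casting such an inference problem as an abstract interpretation.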
Adding HL7 version 3 data types to PostgreSQL
The HL7 standard is widely used to exchange medical information
electronically. As a part of the standard, HL7 defines scalar communication
data types like physical quantity, point in time and concept descriptor but
also complex types such as interval types, collection types and probabilistic
types. Typical HL7 applications will store their communications in a database,
resulting in a translation from HL7 concepts and types into database types.
Since the data types were not designed to be implemented in a relational
database server, this transition is cumbersome and fraught with programmer
error. The purpose of this paper is twofold. First, we analyze the HL7 version
3 data type definitions and define a number of conditions that must be met for
a data type to be suitable for implementation in a relational database. As a
result of this analysis we describe a number of possible improvements in the
HL7 specification. Second we describe an implementation in the PostgreSQL
database server and show that the database server can effectively execute
scientific calculations with units of measure, supports a large number of
operations on time points and intervals, and can perform operations that are
akin to a medical terminology server. Experiments on synthetic data show that
the user defined types perform better than an implementation that uses only
standard data types from the database server.Comment: 12 pages, 9 figures, 6 table
Static Safety for an Actor Dedicated Process Calculus by Abstract Interpretation
The actor model eases the definition of concurrent programs with non-uniform
behaviors. Static analysis of such a model was previously done in a data-flow
oriented way, with type systems. This approach was based on constraint set
resolution and was not able to deal with precise properties for communications
of behaviors. We present here a new approach, control-flow oriented, based on
the abstract interpretation framework, able to deal with communication of
behaviors. Within our new analyses, we are able to verify most of the previous
properties we observed as well as new ones, principally based on occurrence
counting.
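Occurrence counting, the property the analysis is built around, can be sketched as an abstraction of message counts into a small saturating domain. The domain {0, 1, MANY} below is a common illustrative choice, not necessarily the paper's exact abstract domain:

```python
# Occurrence counting as a saturating abstract domain {0, 1, MANY}:
# concrete counts above one are collapsed to MANY. (Illustrative only.)

MANY = "many"

def abstract(n: int):
    # abstraction of a concrete message count
    return n if n in (0, 1) else MANY

def add(a, b):
    # abstract counterpart of adding two counts; saturates at MANY
    if a == 0:
        return b
    if b == 0:
        return a
    return MANY  # 1+1, 1+MANY, MANY+MANY all saturate

# Sending the same message twice is abstracted as MANY occurrences:
print(add(abstract(1), abstract(1)))  # many
```

Tracking whether a message occurs zero, one, or many times is enough to verify properties such as linearity of a communication without tracking exact counts, which keeps the abstract domain finite.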
Matching Logic
This paper presents matching logic, a first-order logic (FOL) variant for
specifying and reasoning about structure by means of patterns and pattern
matching. Its sentences, the patterns, are constructed using variables,
symbols, connectives and quantifiers, but no difference is made between
function and predicate symbols. In models, a pattern evaluates into a power-set
domain (the set of values that match it), in contrast to FOL where functions
and predicates map into a regular domain. Matching logic uniformly generalizes
several logical frameworks important for program analysis, such as:
propositional logic, algebraic specification, FOL with equality, modal logic,
and separation logic. Patterns can specify separation requirements at any level
in any program configuration, not only in the heaps or stores, without any
special logical constructs for that: the very nature of pattern matching is
that if two structures are matched as part of a pattern, then they can only be
spatially separated. Like FOL, matching logic can also be translated into pure
predicate logic with equality, at the same time admitting its own sound and
complete proof system. A practical aspect of matching logic is that FOL
reasoning with equality remains sound, so off-the-shelf provers and SMT solvers
can be used for matching logic reasoning. Matching logic is particularly
well-suited for reasoning about programs in programming languages that have an
operational semantics, but it is not limited to this.
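The central semantic move, a pattern denoting the set of values that match it, can be made concrete with a toy evaluator. The universe, pattern syntax, and symbol names below are our illustrative assumptions, not the paper's notation:

```python
# Toy matching-logic-style semantics: every pattern evaluates to the
# SET of values matching it, so constants, "predicates", and logical
# connectives are all interpreted uniformly. (Illustrative syntax only.)

UNIVERSE = set(range(10))  # a small finite model

def evaluate(pattern):
    kind = pattern[0]
    if kind == "val":          # a constant matches exactly itself
        return {pattern[1]}
    if kind == "even":         # a "predicate" is just another pattern
        return {x for x in UNIVERSE if x % 2 == 0}
    if kind == "and":          # conjunction is set intersection
        return evaluate(pattern[1]) & evaluate(pattern[2])
    if kind == "or":           # disjunction is set union
        return evaluate(pattern[1]) | evaluate(pattern[2])
    if kind == "not":          # negation is complement in the model
        return UNIVERSE - evaluate(pattern[1])
    raise ValueError(f"unknown pattern kind: {kind}")

# "x is even and x is not 4":
print(sorted(evaluate(("and", ("even",), ("not", ("val", 4))))))  # [0, 2, 6, 8]
```

Note how `("even",)` plays the role FOL would assign to a predicate and `("val", 4)` the role of a term, yet both evaluate into the same power-set domain; that uniformity is the point of the abstract's "no difference between function and predicate symbols".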
Collection analysis for Horn clause programs
We consider approximating data structures with collections of the items that
they contain. For example, lists, binary trees, tuples, etc., can be
approximated by sets or multisets of the items within them. Such approximations
can be used to provide partial correctness properties of logic programs. For
example, one might wish to specify that whenever an atom relating a list to its
sorted counterpart is proved, the two lists contain the same multiset of items
(that is, one is a permutation of the other). If sorting removes duplicates,
then one would like to infer that the sets of items underlying the two lists
are the same. Such results could be useful to have if they can be determined
statically and automatically. We present a scheme by which such collection
analysis can be structured and automated. Central to this scheme is the use of
linear logic as a computational logic underlying the logic of Horn clauses.
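The two approximations the abstract mentions, a list collapsed to the multiset or to the set of its items, are easy to state concretely (a minimal sketch; the function names are ours):

```python
# Collection abstractions of a list: forget ordering and structure,
# keep either the multiset or the set of items. (Illustrative names.)
from collections import Counter

def to_multiset(xs):
    return Counter(xs)        # item -> occurrence count

def to_set(xs):
    return frozenset(xs)      # occurrence counts forgotten too

# A sorting relation should preserve the multiset abstraction:
unsorted, sorted_ = [3, 1, 2, 1], [1, 1, 2, 3]
print(to_multiset(unsorted) == to_multiset(sorted_))  # True

# A duplicate-removing sort preserves only the set abstraction:
dedup = [1, 2, 3]
print(to_set(unsorted) == to_set(dedup))              # True
```

A static collection analysis of the kind described would establish such equalities from the clauses of the program, without running it on any particular list.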