An Algebra of Pure Quantum Programming
We develop a sound and complete equational theory for the functional quantum
programming language QML. The soundness and completeness of the theory are with
respect to the previously-developed denotational semantics of QML. The
completeness proof also gives rise to a normalisation algorithm following the
normalisation by evaluation approach. The current work focuses on the pure
fragment of QML, omitting measurements.
Comment: To appear in ENTCS, 3rd International Workshop on Quantum Programming Languages, 2005. 21 pages.
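Since the completeness proof yields a normalisation algorithm in the normalisation-by-evaluation style, a minimal sketch of that technique may help: evaluate terms into a semantic domain of closures and neutral values, then quote (read back) values as beta-normal terms. The sketch below is for the plain untyped lambda calculus with de Bruijn indices, not for QML itself (linearity, quantum data, and superpositions are omitted); all names are illustrative.

```python
# Minimal normalisation by evaluation (NbE) for the untyped lambda
# calculus. Generic illustration only -- not QML's pure fragment.
from dataclasses import dataclass

# --- Syntax: terms with de Bruijn indices ---
@dataclass(frozen=True)
class Var: index: int
@dataclass(frozen=True)
class Lam: body: object
@dataclass(frozen=True)
class App: fun: object; arg: object

# --- Semantics: function values and neutral (stuck) values ---
@dataclass(frozen=True)
class VLam: fn: object                 # a Python function Val -> Val
@dataclass(frozen=True)
class NVar: level: int                 # free variable as a de Bruijn level
@dataclass(frozen=True)
class NApp: neu: object; arg: object   # neutral applied to a value

def evaluate(env, t):
    """Interpret a term in an environment of semantic values."""
    if isinstance(t, Var):
        return env[t.index]
    if isinstance(t, Lam):
        return VLam(lambda v: evaluate([v] + env, t.body))
    f, a = evaluate(env, t.fun), evaluate(env, t.arg)
    return f.fn(a) if isinstance(f, VLam) else NApp(f, a)

def quote(depth, v):
    """Read a semantic value back into a beta-normal term."""
    if isinstance(v, VLam):
        return Lam(quote(depth + 1, v.fn(NVar(depth))))
    if isinstance(v, NVar):
        return Var(depth - v.level - 1)  # convert level to index
    return App(quote(depth, v.neu), quote(depth, v.arg))

def normalise(t):
    return quote(0, evaluate([], t))
```

For instance, `normalise(App(Lam(Var(0)), Lam(Var(0))))` reduces the redex and returns `Lam(Var(0))`; normalisation also proceeds under binders, which is what makes the readback phase necessary.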
Applying quantitative semantics to higher-order quantum computing
Finding a denotational semantics for higher order quantum computation is a
long-standing problem in the semantics of quantum programming languages. Most
past approaches to this problem fell short in one way or another, either
limiting the language to an unusably small finitary fragment, or giving up
important features of quantum physics such as entanglement. In this paper, we
propose a denotational semantics for a quantum lambda calculus with recursion
and an infinite data type, using constructions from quantitative semantics of
linear logic.
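One concrete way to see why entanglement cannot be given up lightly, as the paragraph notes: the Bell state admits no decomposition into independent single-qubit states, so any semantics that interprets a pair of qubits as a pair of separate states is unsound. A tiny self-contained check (plain Python, amplitudes as numbers; purely illustrative, not the paper's semantics):

```python
# A two-qubit pure state (s00, s01, s10, s11) is a product state
# (a,b) (x) (c,d) exactly when s00*s11 == s01*s10 (the 2x2 amplitude
# matrix has rank 1). The Bell state fails this test.
import math

h = 1 / math.sqrt(2)
bell = [h, 0, 0, h]          # (|00> + |11>) / sqrt(2)

def is_product(state, tol=1e-9):
    s00, s01, s10, s11 = state
    return abs(s00 * s11 - s01 * s10) < tol
```

Here `is_product(bell)` is false, while a separable state such as `|00>` (i.e. `[1, 0, 0, 0]`) passes the test.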
Logical ambiguity
The thesis presents research in the field of model-theoretic semantics on the problem of ambiguity, especially as it arises for sentences that contain junctions (and, or) and quantifiers (every man, a woman). A number of techniques that have been proposed are surveyed, and I conclude that these ought to be rejected because they do not make ambiguity 'emergent': they all have the feature that subtheories would be able to explain all syntactic facts yet would predict no ambiguity. In other words, these accounts have a special-purpose mechanism for generating ambiguities.

It is argued that categorial grammars show promise for giving an 'emergent' account. This is because the only way to take a subtheory of a particular categorial grammar is by changing one of the small number of clauses by which the categorial grammar axiomatises an infinite set of syntactic rules, and such a change is likely to have a wider range of effects on the coverage of the grammar than simply the subtraction of ambiguity.

Of the categorial grammars proposed to date, the most powerful is Lambek Categorial Grammar, which defines the set of syntactic rules by a notational variant of Gentzen's sequent calculus for implicational propositional logic, and which defines meaning assignment by using the Curry-Howard isomorphism between Natural Deduction proofs in implicational propositional logic and terms of typed lambda calculus. It is shown that no satisfactory account of the junctions and quantifiers is possible in Lambek Categorial Grammar.

I then introduce a framework that I call Polymorphic Lambek Categorial Grammar, which adds variables and their universal quantification to the language of categorisation. The set of syntactic rules is specified by a notational variant of Gentzen's sequent calculus for quantified propositional logic, and meaning assignment is defined by using Girard's extended Curry-Howard isomorphism between Natural Deduction proofs in quantified implicational propositional logic and terms of the 2nd-order polymorphic lambda calculus. It is shown that this allows an account of the junctions and quantifiers, and one which is 'emergent'.
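The kind of quantifier ambiguity at issue can be made concrete with a toy model-theoretic example: "every man loves a woman" has a surface-scope and an inverse-scope reading that can differ in truth value. The entities, predicates, and relation below are invented purely for illustration and are not drawn from the thesis:

```python
# Generalized quantifiers in the usual Montague style, applied to an
# invented toy model, to exhibit a scope ambiguity.
domain = {"john", "bill", "mary", "sue"}
man    = lambda x: x in {"john", "bill"}
woman  = lambda x: x in {"mary", "sue"}
loves  = lambda x, y: (x, y) in {("john", "mary"), ("bill", "sue")}

# every, some : (e -> t) -> (e -> t) -> t
every = lambda p: lambda q: all(q(x) for x in domain if p(x))
some  = lambda p: lambda q: any(q(x) for x in domain if p(x))

# Surface scope: each man loves some (possibly different) woman.
surface = every(man)(lambda x: some(woman)(lambda y: loves(x, y)))
# Inverse scope: one particular woman is loved by every man.
inverse = some(woman)(lambda y: every(man)(lambda x: loves(x, y)))
```

In this model the surface reading is true (John loves Mary, Bill loves Sue) while the inverse reading is false (no single woman is loved by both men), so the two derivations genuinely assign distinct meanings.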
Changing a semantics: opportunism or courage?
The generalized models for higher-order logics introduced by Leon Henkin, and
their multiple offspring over the years, have become a standard tool in many
areas of logic. Even so, discussion has persisted about their technical status,
and perhaps even their conceptual legitimacy. This paper gives a systematic
view of generalized model techniques, discusses what they mean in mathematical
and philosophical terms, and presents a few technical themes and results about
their role in algebraic representation, calibrating provability, lowering
complexity, understanding fixed-point logics, and achieving set-theoretic
absoluteness. We also show how thinking about Henkin's approach to semantics of
logical systems in this generality can yield new results, dispelling the
impression of adhocness. This paper is dedicated to Leon Henkin, a deep
logician who has changed the way we all work, while also being an always open,
modest, and encouraging colleague and friend.
Comment: 27 pages. To appear in: The life and work of Leon Henkin: Essays on his contributions (Studies in Universal Logic), eds: Manzano, M., Sain, I. and Alonso, E., 201
Compensation methods to support cooperative applications: A case study in automated verification of schema requirements for an advanced transaction model
Compensation plays an important role in advanced transaction models, cooperative work, and workflow systems. A schema designer is typically required to supply, for each transaction, another transaction that semantically undoes its effects. Little attention has been paid, however, to verifying the desirable properties of such operations. This paper demonstrates the use of a higher-order logic theorem prover for verifying that compensating transactions return a database to its original state. It is shown how an OODB schema is translated to the language of the theorem prover so that proofs can be performed on the compensating transactions.
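The property being verified can be stated concretely: running a transaction and then its compensating transaction should leave the database in its original state. A minimal executable sketch of that round-trip check, with an invented account schema rather than the paper's OODB schema or theorem-prover encoding:

```python
# Transactions modelled as pure functions from database state to
# database state; compensation must restore the original state.
# The account schema and operations are illustrative inventions.
def withdraw(db, acct, amount):
    new = dict(db)
    new[acct] -= amount
    return new

def compensate_withdraw(db, acct, amount):
    new = dict(db)
    new[acct] += amount
    return new

db0 = {"a1": 100, "a2": 50}
db1 = withdraw(db0, "a1", 30)
db2 = compensate_withdraw(db1, "a1", 30)
```

The verification obligation is then `db2 == db0` for every reachable state and every argument; a theorem prover discharges this universally, whereas the sketch above checks only one instance.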