A Constructive Approach to Intensional Contexts: Remarks on the Metaphysics of Model Theory
The basic distinction between extensional and intensional contexts is one of different
denotation conditions (truth conditions). This difference in denotation conditions has been related to a fundamental question of semantic theory, namely: what do expressions of natural language denote? Most authors assume that in extensional contexts expressions denote 'real objects.' Since the rules of substitutivity of identicals and existential generalization do not hold in intensional contexts (cf. section 1), they are thus forced to postulate that in intensional contexts expressions denote something else. Frege, for example, assumes a denotational ambiguity between 'Bedeutung' and 'Sinn', Russell between 'primary occurrences' and 'secondary occurrences', Quine between 'proper occurrences' and 'accidental occurrences', and Montague between 'extensions' and 'intensions'.
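The extension/intension distinction can be made concrete in a few lines of code. The sketch below is a minimal illustration (all names and the toy model are hypothetical, not drawn from the paper): an intension is modelled as a map from indices (possible worlds) to extensions, and an intensional operator such as necessity quantifies over indices, which is why substitutivity of extensionally identical terms fails inside it.

```python
# Toy Montague-style model: an intension maps indices (worlds) to extensions.
WORLDS = ["actual", "counterfactual"]

# Two terms with the same extension at the actual index but different intensions.
morning_star = {"actual": "venus", "counterfactual": "venus"}
evening_star = {"actual": "venus", "counterfactual": "mars"}

def extension(intension, world):
    # The extension of a term is its value at a given index.
    return intension[world]

def necessarily(proposition):
    # An intensional operator: consumes an intension (a world-indexed
    # proposition), not an extension, and quantifies over all indices.
    return all(proposition(w) for w in WORLDS)

def same_as_morning_star(term):
    return lambda w: extension(term, w) == extension(morning_star, w)

# Extensionally identical at the actual index...
assert extension(morning_star, "actual") == extension(evening_star, "actual")

# ...yet substituting one for the other inside the intensional operator
# changes the truth value, so substitutivity of identicals fails here.
print(necessarily(same_as_morning_star(morning_star)))  # True
print(necessarily(same_as_morning_star(evening_star)))  # False
```

The failure of existential generalization in such contexts has the same source: the operator sees the whole index-to-extension function, not merely its value at the actual index.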
Constraining Montague Grammar for computational applications
This work develops efficient methods for the implementation of Montague Grammar on
a computer. It covers both the syntactic and the semantic aspects of that task. Using a
simplified but adequate version of Montague Grammar it is shown how to translate from
an English fragment to a purely extensional first-order language which can then be made
amenable to standard automatic theorem-proving techniques.
Translating a sentence of Montague English into the first-order predicate calculus
usually proceeds via an intermediate translation in the typed lambda calculus which is
then simplified by lambda-reduction to obtain a first-order equivalent. If sufficient sortal
structure underlies the type theory for the reduced translation to always be a first-order
one then perhaps it should be directly constructed during the syntactic analysis of the
sentence so that the lambda-expressions never come into existence and no further
processing is necessary. A method is proposed to achieve this involving the unification
of meta-logical expressions which flesh out the type symbols of Montague's type theory
with first-order schemas.
It is then shown how to implement Montague Semantics without using a theorem prover
for type theory. Nothing more than a theorem prover for the first-order predicate
calculus is required. The first-order system can be used directly without encoding the
whole of type theory. It is only necessary to encode a part of second-order logic and
this can be done in an efficient, succinct, and readable manner. Furthermore the
pseudo-second-order terms need never appear in any translations provided by the parser.
They are vital just when higher-order reasoning must be simulated.
The foundation of this approach is its five-sorted theory of Montague Semantics. The
objects in this theory are entities, indices, propositions, properties, and quantities. It is a
theory which can be expressed in the language of first-order logic by means of axiom
schemas and there is a finite second-order axiomatisation which is the basis for the
theorem-proving arrangement. It can be viewed as a very constrained set theory.
Logical ambiguity
The thesis presents research in the field of model theoretic semantics on the problem of ambiguity,
especially as it arises for sentences that contain junctions (and,or) and quantifiers (every man,
a woman). A number of techniques that have been proposed are surveyed, and I conclude that these ought to be rejected because they do not make ambiguity 'emergent': they all have the feature that subtheories would be able to explain all syntactic facts yet would predict no ambiguity. In other words, these accounts have a special-purpose mechanism for generating ambiguities.
It is argued that categorial grammars show promise for giving an 'emergent' account. This is because the only way to take a subtheory of a particular categorial grammar is by changing one of the small number of clauses by which the categorial grammar axiomatises an infinite set of syntactic rules, and such a change is likely to have a wider range of effects on the coverage of the grammar than simply the subtraction of ambiguity.
Of the categorial grammars proposed to date, the most powerful is Lambek Categorial Grammar, which defines the set of syntactic rules by a notational variant of Gentzen's sequent calculus for implicational propositional logic, and which defines meaning assignment by using the Curry-Howard isomorphism between Natural Deduction proofs in implicational propositional logic and terms of the typed lambda calculus. It is shown that no satisfactory account of the junctions and quantifiers is possible in Lambek Categorial Grammar.
I then introduce a framework that I call Polymorphic Lambek Categorial Grammar, which adds variables and their universal quantification to the language of categorisation. The set of syntactic rules is specified by a notational variant of Gentzen's sequent calculus for quantified propositional logic, and meaning assignment is defined by using Girard's extended Curry-Howard isomorphism between Natural Deduction proofs in quantified implicational propositional logic and terms of the second-order polymorphic lambda calculus. It is shown that this allows an account of the junctions and quantifiers, and one which is 'emergent'.
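The Curry-Howard reading used above can be shown in miniature (a sketch only; the function name is hypothetical): an implicational formula is a function type, and a Natural Deduction proof of it is a well-typed lambda term. Here the propositional theorem (A → B) → ((B → C) → (A → C)), hypothetical syllogism, is proved by function composition.

```python
from typing import Callable, TypeVar

A = TypeVar("A")
B = TypeVar("B")
C = TypeVar("C")

def hypothetical_syllogism(
    f: Callable[[A], B]
) -> Callable[[Callable[[B], C]], Callable[[A], C]]:
    # The proof term λf. λg. λa. g (f a): each lambda discharges one
    # hypothesis, each application is a use of modus ponens.
    return lambda g: lambda a: g(f(a))

# Instantiating the type variables at concrete types runs the proof.
result = hypothetical_syllogism(lambda n: n + 1)(lambda n: n * 2)(3)  # (3+1)*2 = 8
```

Under this correspondence, the second-order quantification the thesis adds to the category language corresponds to the type-variable abstraction of the polymorphic lambda calculus, exactly as Girard's extension describes.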
New Equations for Neutral Terms: A Sound and Complete Decision Procedure, Formalized
The definitional equality of an intensional type theory is its test of type
compatibility. Today's systems rely on ordinary evaluation semantics to compare
expressions in types, frustrating users with type errors arising when
evaluation fails to identify two `obviously' equal terms. If only the machine
could decide a richer theory! We propose a way to decide theories which supplement evaluation with 'ν-rules', rearranging the neutral parts of normal forms, and report a successful initial experiment.
We study a simple λ-calculus with primitive fold, map and append operations on
lists and develop in Agda a sound and complete decision procedure for an
equational theory enriched with monoid, functor and fusion laws.
Transpension: The Right Adjoint to the Pi-type
Presheaf models of dependent type theory have been successfully applied to
model HoTT, parametricity, and directed, guarded and nominal type theory. There
has been considerable interest in internalizing aspects of these presheaf
models, either to make the resulting language more expressive, or in order to
carry out further reasoning internally, allowing greater abstraction and
sometimes automated verification. While the constructions of presheaf models
largely follow a common pattern, approaches towards internalization do not.
Throughout the literature, various internal presheaf operators (√, Φ/extent, Ψ/Gel, Glue, Weld, mill, the strictness axiom and locally fresh names) can be found, and little is known about their relative expressiveness. Moreover, some of these
require that variables whose type is a shape (representable presheaf, e.g. an
interval) be used affinely.
We propose a novel type former, the transpension type, which is right adjoint
to universal quantification over a shape. Its structure resembles a dependent
version of the suspension type in HoTT. We give general typing rules and a
presheaf semantics in terms of base category functors dubbed multipliers.
Structural rules for shape variables and certain aspects of the transpension
type depend on characteristics of the multiplier. We demonstrate how the
transpension type and the strictness axiom can be combined to implement all and
improve some of the aforementioned internalization operators (without formal
claim in the case of locally fresh names).
Introducing Continuations
This working paper introduces CONTINUATIONS (a concept borrowed from computer science) as a new technique for characterizing certain aspects of the semantics of a natural language. I should emphasize at the outset that this is just an introduction, and that a more rigorous and thorough treatment is under development (see Barker (ms)). In the meantime, this paper mentions certain formal results without proving them, and describes certain new empirical generalizations without exploring them. What it will do is provide an explicit account of a range of familiar phenomena related to quantification, including quantifier scope ambiguity, NP as a scope island, and generalized coordination. What makes the account noteworthy is that it provides a fully and strictly compositional analysis of quantification and generalized coordination that does not rely on syntactic movement operations such as Quantifier Movement, auxiliary storage mechanisms such as Cooper Storage, or type ambiguity as in Hendriks' Flexible Types system.
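The core idea of a continuized semantics can be sketched directly in code (a toy model with hypothetical names, not Barker's formal system): a quantified NP denotes a function that takes its own continuation, the rest of the sentence, and scope ambiguity falls out of the order in which continuations are applied, with no movement or storage mechanism.

```python
# A toy model in which the two scopings come apart.
MEN = {"al", "bob"}
WOMEN = {"cara", "dana"}
LOVES = {("al", "cara"), ("bob", "dana")}

def every(domain):
    # "every N" consumes a continuation k (the rest of the sentence).
    return lambda k: all(k(x) for x in domain)

def some(domain):
    return lambda k: any(k(x) for x in domain)

def loves(x, y):
    return (x, y) in LOVES

# "Every man loves some woman", surface scope (every > some):
surface = every(MEN)(lambda m: some(WOMEN)(lambda w: loves(m, w)))

# Inverse scope (some > every): the same lexical entries, applied in the
# other order, with no movement operation and no storage mechanism.
inverse = some(WOMEN)(lambda w: every(MEN)(lambda m: loves(m, w)))

print(surface, inverse)  # True False
```

Here each man loves some woman or other, but no single woman is loved by every man, so the two application orders yield different truth values: precisely the scope ambiguity the paper derives compositionally.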