
    Incremental semantics and interactive syntactic processing

    Gapping as Constituent Coordination

    A number of coordinate constructions in natural languages conjoin sequences which do not appear to correspond to syntactic constituents in the traditional sense. One striking instance of the phenomenon is afforded by the gapping construction of English, of which the following sentence is a simple example: (1) Harry eats beans, and Fred, potatoes. Since all theories agree that coordination must in fact be an operation upon constituents, most of them have dealt with the apparent paradox presented by such constructions by supposing that sequences such as the right conjunct in the above example, Fred, potatoes, should be treated in the grammar as traditional constituents of type S, but with pieces missing or deleted.
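    One combinatory alternative, in the style of Combinatory Categorial Grammar, treats the raised and composed pair as a derivable constituent in its own right. As a hedged illustration of that general idea (not the paper's exact analysis; the datatype and rule names are ours), the following Haskell sketch shows how type raising plus forward crossing composition assigns the sequence Fred, potatoes a single category that seeks a transitive verb:

        -- Categories: S, NP, and the two slash types.
        data Cat = S | NP
                 | Cat :/ Cat   -- X/Y: a function seeking Y to its right
                 | Cat :\ Cat   -- X\Y: a function seeking Y to its left
                 deriving (Eq, Show)

        -- Forward type raising: X => T/(T\X)
        raiseF :: Cat -> Cat -> Cat
        raiseF t x = t :/ (t :\ x)

        -- Backward type raising: X => T\(T/X)
        raiseB :: Cat -> Cat -> Cat
        raiseB t x = t :\ (t :/ x)

        -- Forward crossing composition: X/Y, Y\Z => X\Z
        composeFX :: Cat -> Cat -> Maybe Cat
        composeFX (x :/ y) (y' :\ z) | y == y' = Just (x :\ z)
        composeFX _ _                          = Nothing

        -- "Fred" type-raised over S; "potatoes" type-raised over the VP (S\NP).
        fred, potatoes :: Cat
        fred     = raiseF S NP            -- S/(S\NP)
        potatoes = raiseB (S :\ NP) NP    -- (S\NP)\((S\NP)/NP)

        -- Composing them yields one constituent that seeks a transitive
        -- verb of category (S\NP)/NP -- the material the left conjunct supplies.
        main :: IO ()
        main = print (composeFX fred potatoes)
        -- Just (S :\ ((S :\ NP) :/ NP))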

    Categorical combinators

    Our main aim is to present the connection between λ-calculus and Cartesian closed categories in an untyped and purely syntactic setting. More specifically, we establish a syntactic equivalence theorem between what we call categorical combinatory logic and λ-calculus with explicit products and projections, with β- and η-rules as well as with surjective pairing. “Combinatory logic” is of course inspired by Curry's combinatory logic, based on the well-known S, K, I. Our combinatory logic is “categorical” because its combinators and rules are obtained by extracting untyped information from Cartesian closed categories (looking at arrows only, thus forgetting about objects). Compiling λ-calculus into these combinators happens to be natural and provokes only n log n code expansion. Moreover, categorical combinatory logic is entirely faithful to β-reduction, where combinatory logic needs additional, rather complex and unnatural axioms to be. The connection easily extends to the corresponding typed calculi, in which typed categorical combinatory logic is a free Cartesian closed category where the notion of terminal object is replaced by the explicit manipulation of applying (a function to its argument) and coupling (arguments to build data in products). Our syntactic equivalences induce equivalences at the model level. The paper is intended as a mathematical foundation for developing implementations of functional programming languages based on a “categorical abstract machine,” as developed in a companion paper (Cousineau, Curien, and Mauny, in “Proceedings, ACM Conf. on Functional Programming Languages and Computer Architecture,” Nancy, 1985).
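    To make the compilation concrete, here is a minimal Haskell sketch (our own constructor names, and only the core rules) of translating de Bruijn-indexed λ-terms into categorical combinators in the style of the categorical abstract machine: variables become access paths built from Fst and Snd, abstraction becomes currying, and application pairs the translated function with its argument.

        -- λ-terms with de Bruijn indices, and categorical combinators.
        data Term = Var Int          -- de Bruijn index
                  | Lam Term
                  | App Term Term

        data Comb = Id | Fst | Snd
                  | Comp Comb Comb   -- Comp g f is the composition g . f
                  | Pair Comb Comb   -- <f, g>
                  | Cur Comb         -- currying
                  | Apply            -- the application combinator
                  deriving Show

        -- Variables become environment access paths, abstraction becomes
        -- Cur, and application becomes Apply after pairing the parts.
        compile :: Term -> Comb
        compile (Var 0)   = Snd
        compile (Var n)   = Comp (compile (Var (n - 1))) Fst
        compile (Lam b)   = Cur (compile b)
        compile (App f a) = Comp Apply (Pair (compile f) (compile a))

        -- λx. x  ~>  Cur Snd;  λx. λy. x  ~>  Cur (Cur (Comp Snd Fst))
        main :: IO ()
        main = mapM_ (print . compile)
          [ Lam (Var 0)
          , Lam (Lam (Var 1))
          , App (Lam (Var 0)) (Lam (Var 0))
          ]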

    Type-driven semantic interpretation and feature dependencies in R-LFG

    Once one has enriched LFG's formal machinery with the linear logic mechanisms needed for semantic interpretation, as proposed by Dalrymple et al., it is natural to ask whether these make any existing components of LFG redundant. As Dalrymple and her colleagues note, LFG's f-structure completeness and coherence constraints fall out as a by-product of the linear logic machinery they propose for semantic interpretation, thus making those f-structure mechanisms redundant. Given that linear logic machinery, or something like it, is independently needed for semantic interpretation, it seems reasonable to explore the extent to which it is capable of handling feature structure constraints as well. R-LFG represents the extreme position that all linguistically required feature structure dependencies can be captured by the resource-accounting machinery of a linear or similar logic independently needed for semantic interpretation, making LFG's unification machinery redundant. The goal is to show that LFG linguistic analyses can be expressed as clearly and perspicuously using the smaller set of mechanisms of R-LFG as they can using the much larger set of unification-based mechanisms in LFG: if this is the case, then we will have shown that positing these extra f-structure mechanisms is not linguistically warranted.
    (Comment: 30 pages, to appear in the ``Glue Language'' volume edited by Dalrymple; uses tree-dvips, ipa, epic, eepic, fullnam)
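    The resource-accounting idea can be caricatured in a few lines: lexical premises are linear-logic formulas that must each be consumed exactly once in deriving the sentence's goal formula, so both leftover and missing resources make the derivation fail. The following toy Haskell sketch is ours and merely stands in for R-LFG's actual machinery; the formulas and premise names are invented.

        import Data.List (inits, tails)

        data Formula = Atom String
                     | Formula :-> Formula   -- linear implication
                     deriving (Eq, Show)

        infixr 5 :->

        -- All ways to pick one element out of a list, returning the rest.
        picks :: [a] -> [(a, [a])]
        picks xs = [ (x, pre ++ post) | (pre, x:post) <- zip (inits xs) (tails xs) ]

        -- derives ps goal: can goal be derived using every premise exactly once?
        derives :: [Formula] -> Formula -> Bool
        derives [p] goal = p == goal
        derives ps  goal = or
          [ derives (b : rest) goal        -- linear modus ponens, consuming both
          | (a :-> b, rest0) <- picks ps
          , (a', rest)       <- picks rest0
          , a == a' ]

        main :: IO ()
        main = do
          let harry  = Atom "h"                -- Harry contributes a resource h
              sleeps = Atom "h" :-> Atom "s"   -- sleeps consumes h, produces s
          print (derives [harry, sleeps] (Atom "s"))         -- True
          print (derives [harry, harry, sleeps] (Atom "s"))  -- False: leftover resource
          print (derives [sleeps] (Atom "s"))                -- False: missing resource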

    Cyclic Datatypes modulo Bisimulation based on Second-Order Algebraic Theories

    Cyclic data structures, such as cyclic lists, in functional programming are tricky to handle because of their cyclicity. This paper presents an investigation of categorical, algebraic, and computational foundations of cyclic datatypes. Our framework of cyclic datatypes is based on the second-order algebraic theories of Fiore et al., which give a uniform setting for syntax, types, and computation rules for describing and reasoning about cyclic datatypes. We extract the "fold" computation rules from the categorical semantics based on the iteration categories of Bloom and Esik. Thereby, the rules are correct by construction. We prove strong normalisation using the General Schema criterion for second-order computation rules. Rather than the fixed point law, we choose the Bekic law for computation, which is key to obtaining strong normalisation. We also prove the property of "Church-Rosser modulo bisimulation" for the computation rules. Combining these results, we obtain a remarkable decidability result for the equational theory of cyclic data and fold.
    (Comment: 38 pages)
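    The flavour of the "fold" computation rules can be conveyed by a much-simplified Haskell sketch (ours; the paper works in a general second-order setting, not just lists): a cyclic-list syntax with a single cycle binder, whose fold interprets the cycle as a lazy fixed point.

        data CList a = Nil
                     | Cons a (CList a)
                     | Ptr              -- back-reference to the nearest Cyc
                     | Cyc (CList a)    -- ties the knot: binds Ptr in its body

        -- Fold with algebra (cons, nil); a cycle is interpreted as the fixed
        -- point of its body (one binder level only, for simplicity).
        cfold :: (a -> b -> b) -> b -> CList a -> b
        cfold cons nil = go (error "Ptr outside any Cyc")
          where
            go _ Nil        = nil
            go r (Cons x t) = cons x (go r t)
            go r Ptr        = r
            go _ (Cyc body) = let r = go r body in r   -- lazy fixed point

        -- The cyclic list 1:2:1:2:...; laziness lets us observe finite prefixes.
        oneTwo :: CList Int
        oneTwo = Cyc (Cons 1 (Cons 2 Ptr))

        main :: IO ()
        main = print (take 7 (cfold (:) [] oneTwo))   -- [1,2,1,2,1,2,1]

    Note that equality modulo bisimulation identifies syntactically different cycles, e.g. the once-unrolled Cyc (Cons 1 (Cons 2 (Cons 1 (Cons 2 Ptr)))) folds to the same stream as oneTwo.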

    Discriminating Lambda-Terms Using Clocked Boehm Trees

    As observed by Intrigila, there are hardly any techniques available in the lambda-calculus to prove that two lambda-terms are not beta-convertible. Techniques employing the usual Boehm Trees are inadequate when we deal with terms having the same Boehm Tree (BT). This is the case in particular for fixed point combinators, as they all have the same BT. Another interesting equation, whose consideration was suggested by Scott, is BY = BYS, an equation valid in the classical model P-omega of the lambda-calculus, and hence valid with respect to BT-equality, but whose two sides are nevertheless beta-inconvertible. To prove such beta-inconvertibilities, we employ `clocked' BT's, with annotations that convey information about the tempo at which the data in the BT are produced. Boehm Trees are thus enriched with an intrinsic clock behaviour, leading to a refined discrimination method for lambda-terms. The corresponding equality is strictly intermediate between beta-convertibility and Boehm Tree equality, the equality in the model P-omega. An analogous approach pertains to Levy-Longo and Berarducci Trees. Our refined Boehm Trees find in particular an application in beta-discriminating fixed point combinators (fpc's). It turns out that Scott's equation BY = BYS is the key to unlocking a plethora of fpc's, generated by a variety of production schemes, of which the simplest was found by Boehm, stating that new fpc's are obtained by postfixing the term SI, also known as Smullyan's Owl. We prove that all these newly generated fpc's are indeed new, by considering their clocked BT's. Even so, not all pairs of new fpc's can be discriminated this way. For that purpose we increase the discrimination power by refining the clock notion to what we call the `atomic clock'.
    (Comment: arXiv admin note: substantial text overlap with arXiv:1002.257)
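    The clock intuition can be simulated directly: reduce in normal order and count the steps needed to expose each successive head variable of the Boehm tree. The following self-contained Haskell sketch (our simplification, not the paper's formal definition) compares Curry's fpc Y with the fpc obtained by postfixing Smullyan's Owl SI; the two produce the same BT, but at different tempos.

        data T = V Int | L T | A T T   -- de Bruijn lambda terms

        -- Standard de Bruijn shifting and substitution.
        shift :: Int -> Int -> T -> T
        shift d c (V k)   = if k >= c then V (k + d) else V k
        shift d c (L t)   = L (shift d (c + 1) t)
        shift d c (A f a) = A (shift d c f) (shift d c a)

        subst :: Int -> T -> T -> T
        subst j s (V k)   = if k == j then s else V k
        subst j s (L t)   = L (subst (j + 1) (shift 1 0 s) t)
        subst j s (A f a) = A (subst j s f) (subst j s a)

        -- One leftmost (normal-order) beta step, if there is one.
        step :: T -> Maybe T
        step (A (L b) a) = Just (shift (-1) 0 (subst 0 (shift 1 0 a) b))
        step (A f a)     = case step f of
                             Just f' -> Just (A f' a)
                             Nothing -> A f <$> step a
        step (L b)       = L <$> step b
        step (V _)       = Nothing

        -- Clock ticks: steps needed to reach each of the first n head variables.
        ticks :: Int -> T -> [Int]
        ticks 0 _ = []
        ticks n t = go 0 t
          where
            go k u = case u of
              A (V _) a -> k : ticks (n - 1) a
              _         -> maybe [k] (go (k + 1)) (step u)

        -- Curry's Y = \f.(\x. f (x x)) (\x. f (x x)); S, I, and the Owl S I.
        yC, sT, iT, owl :: T
        yC  = L (A w w) where w = L (A (V 1) (A (V 0) (V 0)))
        sT  = L (L (L (A (A (V 2) (V 0)) (A (V 1) (V 0)))))
        iT  = L (V 0)
        owl = A sT iT

        main :: IO ()
        main = do
          print (ticks 4 (A yC (V 0)))          -- [2,1,1,1]: Y's clock
          print (ticks 4 (A (A yC owl) (V 0)))  -- [6,5,5,5]: strictly slower, so new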

    Kolmogorov Complexity in perspective. Part II: Classification, Information Processing and Duality

    We survey diverse approaches to the notion of information, from Shannon entropy to Kolmogorov complexity. Two of the main applications of Kolmogorov complexity are presented: randomness and classification. The survey is divided into two parts published in the same volume. Part II is dedicated to the relation between logic and information systems, within the scope of Kolmogorov algorithmic information theory. We present a recent application of Kolmogorov complexity: classification using compression, an idea with provocative implementations by authors such as Bennett, Vitanyi and Cilibrasi. This stresses how Kolmogorov complexity, besides being a foundation for randomness, is also related to classification. Another approach to classification is also considered: the so-called "Google classification". It uses another original and attractive idea, connected both to classification using compression and to Kolmogorov complexity from a conceptual point of view. We present and unify these different approaches to classification in terms of Bottom-Up versus Top-Down operational modes, whose fundamental principles and underlying duality we point out. We look at the way these two dual modes are used in different approaches to information systems, particularly the relational model for databases introduced by Codd in the 1970s. This allows us to point out diverse forms of a fundamental duality. These operational modes are also reinterpreted in the context of the comprehension schema of axiomatic set theory ZF. This leads us to show how Kolmogorov complexity is linked to intensionality, abstraction, classification and information systems.
    (Comment: 43 pages)
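    Classification using compression rests on the normalized compression distance NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)), where a real compressor stands in for the uncomputable Kolmogorov complexity C. A minimal Haskell sketch, assuming the zlib and bytestring packages:

        import qualified Data.ByteString.Lazy as B
        import qualified Data.ByteString.Lazy.Char8 as C8
        import Codec.Compression.GZip (compress)

        -- C(x): compressed size as a computable stand-in for Kolmogorov complexity.
        csize :: B.ByteString -> Int
        csize = fromIntegral . B.length . compress

        -- NCD(x,y) = (C(xy) - min(C(x),C(y))) / max(C(x),C(y))
        ncd :: B.ByteString -> B.ByteString -> Double
        ncd x y = fromIntegral (cxy - min cx cy) / fromIntegral (max cx cy)
          where cx = csize x; cy = csize y; cxy = csize (B.append x y)

        main :: IO ()
        main = do
          let a = C8.pack (concat (replicate 50 "the cat sat on the mat. "))
              b = C8.pack (concat (replicate 50 "the cat sat on the rug. "))
              c = C8.pack (concat (replicate 50 "int main(void){return 0;} "))
          print (ncd a b)  -- small: similar texts compress well together
          print (ncd a c)  -- larger: dissimilar texts share less structure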

    Categorial Grammar

    The paper is a review article comparing a number of approaches to natural language syntax and semantics that have been developed using categorial frameworks. It distinguishes two related but distinct varieties of categorial theory, one related to Natural Deduction systems and the axiomatic calculi of Lambek, and another which involves more specialized combinatory operations.

    Grammars and Processors

    The paper discusses the role of grammars in sentence processing, and explores some consequences of the Strong Competence Hypothesis of Bresnan and Kaplan for combinatory theories of grammar.

    Type-driven natural language analysis

    The purpose of this thesis is to show how recent developments in logic programming can be exploited to encode in a computational environment the features of certain linguistic theories. In this way we are able to make available, for natural language processing, sophisticated capabilities of linguistic analysis directly justified by well-developed grammatical frameworks. More specifically, we exploit hypothetical reasoning, recently proposed as one possible direction for extending logic programming, to account for the syntax of filler-gap dependencies along the lines of linguistic theories such as Generalized Phrase Structure Grammar and Categorial Grammar. Moreover, for the semantic analysis of the same kind of phenomena, we make use of another recently proposed extension, interestingly related to the previous one, namely the idea of replacing first-order terms with the more expressive λ-terms of the λ-calculus.
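    The hypothetical-reasoning treatment of filler-gap dependencies can be caricatured as follows: parsing a relative clause introduces an NP hypothesis, and the embedded sentence must discharge it exactly once, at the gap site. The Haskell sketch below (a toy grammar and lexicon of our own, standing in for the thesis's logic-programming encoding) threads the hypothesis count through a list-of-successes parser.

        type State  = (Int, [String])   -- (available NP hypotheses, remaining input)
        type Parser = State -> [State]

        word :: String -> Parser
        word w (h, t:ts) | t == w = [(h, ts)]
        word _ _                  = []

        -- Sequencing and choice.
        (>>>) :: Parser -> Parser -> Parser
        (p >>> q) s = concatMap q (p s)

        (<|>) :: Parser -> Parser -> Parser
        (p <|> q) s = p s ++ q s

        -- A gap discharges one hypothesis instead of consuming input.
        gap :: Parser
        gap (h, ts) | h > 0 = [(h - 1, ts)]
        gap _               = []

        np, vp, sent :: Parser
        np   = word "Mary" <|> word "beans" <|> gap
        vp   = word "sleeps" <|> (word "eats" >>> np)
        sent = np >>> vp

        -- "that" introduces an NP hypothesis; the embedded sentence must
        -- have used it exactly once by the time it finishes.
        relClause :: Parser
        relClause = word "that" >>> hypothesize sent
          where hypothesize p (h, ts) = [ (h, ts') | (h', ts') <- p (h + 1, ts), h' == h ]

        parses :: Parser -> [String] -> Bool
        parses p ws = any (\(h, rest) -> null rest && h == 0) (p (0, ws))

        main :: IO ()
        main = do
          print (parses relClause (words "that Mary eats"))   -- True: object gap
          print (parses relClause (words "that eats beans"))  -- True: subject gap
          print (parses relClause (words "that Mary sleeps")) -- False: hypothesis unused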