
    TDL: a type description language for HPSG. Part 1: Overview

    Unification-based grammar formalisms have become the predominant paradigm in natural language processing (NLP) and computational linguistics (CL). Their success stems from the fact that they can be seen as high-level declarative programming languages for linguists, which allow them to express linguistic knowledge in a monotonic fashion. Moreover, such formalisms can be given a precise set-theoretical semantics. This paper presents TDL, a typed feature-based language and inference system, which is specifically designed to support highly lexicalized grammar theories like HPSG, FUG, or CUG. TDL allows the user to define possibly recursive, hierarchically ordered types consisting of type constraints and feature constraints over the boolean connectives ∧, ∨, and ¬. TDL distinguishes between avm types (open-world reasoning), sort types (closed-world reasoning), built-in types, and atoms, and allows the declaration of partitions and incompatible types. Working with partially as well as with fully expanded types is possible, both at definition time and at run time. TDL is incremental, i.e., it allows the redefinition of types and the use of undefined types. Efficient reasoning is accomplished through four specialized reasoners.
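
    Below is a minimal, illustrative sketch (in Python, not in TDL's own syntax) of the core operation such systems build on: unification of typed feature structures over a type hierarchy, using greatest-lower-bound (glb) computation on types. The hierarchy, type names, and features are invented for the example; TDL's actual reasoners also handle boolean connectives, recursion, reentrancy, and the open-world/closed-world distinction, none of which appear here.

        # Illustrative sketch only, not TDL: typed feature-structure
        # unification over a hand-written type hierarchy.

        SUBTYPES = {              # immediate subtype edges (invented example)
            "top":  {"sign", "head"},
            "sign": {"word", "phrase"},
        }

        def subsumes(t1, t2):
            """True if t1 equals t2 or is an ancestor of t2."""
            if t1 == t2:
                return True
            return any(subsumes(s, t2) for s in SUBTYPES.get(t1, ()))

        def glb(t1, t2):
            """Greatest lower bound of two types; None if incompatible."""
            if subsumes(t1, t2):
                return t2
            if subsumes(t2, t1):
                return t1
            return None           # incompatible types: unification fails

        def unify(fs1, fs2):
            """Unify feature structures of the form (type, {feature: fs})."""
            (t1, fe1), (t2, fe2) = fs1, fs2
            t = glb(t1, t2)
            if t is None:
                return None
            feats = dict(fe1)
            for f, v in fe2.items():
                if f in feats:
                    sub = unify(feats[f], v)
                    if sub is None:
                        return None
                    feats[f] = sub
                else:
                    feats[f] = v
            return (t, feats)

        word = ("word", {"AGR": ("top", {})})
        print(unify(word, ("phrase", {})))   # None: incompatible sorts
        print(unify(("sign", {}), word))     # ('word', {'AGR': ('top', {})})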

    Efficient Constraints on Possible Worlds for Reasoning About Necessity

    Modal logics offer natural, declarative representations for describing both the modular structure of logical specifications and the attitudes and behaviors of agents. The results of this paper further the goal of building practical, efficient reasoning systems using modal logics. The key problem in modal deduction is reasoning about the world in a model (or scope in a proof) at which an inference rule is applied - a potentially hard problem. This paper investigates the use of partial-order mechanisms to maintain constraints on the application of modal rules in proof search in restricted languages. The main result is a simple, incremental polynomial-time algorithm to correctly order rules in proof trees for combinations of K, K4, T and S4 necessity operators governed by a variety of interactions, assuming an encoding of negation using a scoped constant ⊥. This contrasts with previous equational unification methods, which have exponential performance in general because they simply guess among possible intercalations of modal operators. The new, fast algorithm is appropriate for use in a wide variety of applications of modal logic, from planning to logic programming.
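
    As a rough illustration of the partial-order machinery only (not of the paper's axiom-specific algorithm for K, K4, T and S4), the Python sketch below maintains precedence constraints between modal-rule applications and rejects any constraint that would close a cycle. Each addition costs at most one graph traversal, so the constraint store is incremental and polynomial-time; the rule labels are invented.

        # Sketch: a partial order over rule applications, grown incrementally.
        from collections import defaultdict

        class PartialOrder:
            def __init__(self):
                self.succ = defaultdict(set)   # a -> rules that must follow a

            def _reaches(self, src, dst):
                stack, seen = [src], set()
                while stack:
                    n = stack.pop()
                    if n == dst:
                        return True
                    if n not in seen:
                        seen.add(n)
                        stack.extend(self.succ[n])
                return False

            def require_before(self, a, b):
                """Constrain rule a to apply before rule b; fail on a cycle."""
                if self._reaches(b, a):   # b already precedes a: inconsistent
                    return False
                self.succ[a].add(b)
                return True

        po = PartialOrder()
        assert po.require_before("box-K@w1", "box-T@w2")
        assert po.require_before("box-T@w2", "box-4@w3")
        assert not po.require_before("box-4@w3", "box-K@w1")  # closes a cycle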

    The generation of concurrent code for declarative languages

    PhD thesis. This thesis presents an approach to the implementation of declarative languages on a simple, general-purpose concurrent architecture. The safe exploitation of the available concurrency is managed by relatively sophisticated code-generation techniques which transform programs into an intermediate concurrent machine code. Compilation techniques are discussed for 1'-HYBRID, a strongly typed applicative language, and for 'C-HYBRID, a concurrent, nondeterministic logic language. An approach is presented for 1'-HYBRID whereby the style of programming influences the concurrency utilised when a program executes. Code-transformation techniques are presented which generalise tail-recursion optimisation, allowing many recursive functions to be modelled by perpetual processes. A scheme is also presented to allow parallelism to be increased by the use of local declarations, and constrained by the use of special forms of identity function. In order to preserve determinism in the language, a novel fault-handling mechanism is used, whereby exceptions generated at run time are treated as a special class of values within the language (sketched below). A description is given of 'C-HYBRID, a dialect of the nondeterministic logic language Concurrent Prolog. The language is embedded within the applicative language 1'-HYBRID, yielding a combined applicative and logic programming language. Various cross-calling techniques are described, including the use of applicative scoping rules to allow local logical assertions. A description is given of a polymorphic typechecking algorithm for logic programs, which allows different instances of clauses to unify objects of different types. The concept of a method is derived to allow unification information to be passed as an implicit argument to clauses which require it. In addition, the typechecking algorithm permits higher-order objects such as functions to be passed within arguments to clauses. Using Concurrent Prolog's model of concurrency, techniques are described which permit compilation of 'C-HYBRID programs to abstract machine code derived from that used for the applicative language. The use of methods allows polymorphic logic programs to execute without the need for run-time type information in data structures. (Funded by the Science and Engineering Research Council.)
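
    The fault-handling idea - run-time exceptions treated as ordinary values - can be illustrated independently of the HYBRID languages. In the Python sketch below (all names invented), faults propagate through otherwise pure functions, so evaluation order, and hence concurrent scheduling, cannot change the observable result.

        # Sketch: faults as first-class values rather than raised exceptions.
        class Fault:
            def __init__(self, reason):
                self.reason = reason
            def __repr__(self):
                return f"Fault({self.reason!r})"

        def lift(f):
            """Wrap a pure function so that faults pass through it."""
            def wrapped(*args):
                for a in args:
                    if isinstance(a, Fault):
                        return a          # leftmost fault wins, deterministically
                try:
                    return f(*args)
                except Exception as e:
                    return Fault(str(e))  # trap the fault as a value
            return wrapped

        div = lift(lambda x, y: x / y)
        add = lift(lambda x, y: x + y)
        print(add(div(1, 0), 2))   # Fault('division by zero'), nothing raised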

    Towards Intelligent Databases

    This article is a presentation of the objectives and techniques of deductive databases. The deductive approach aims at extending, with intensional definitions, database paradigms that describe applications extensionally. We first show how constructive specifications can be expressed with deduction rules, and how normative conditions can be defined using integrity constraints. We outline the principles of bottom-up and top-down query-answering procedures and present the techniques used for integrity checking. We then argue that it is often desirable to manage with a database system not only database applications, but also specifications of system components. We present such meta-level specifications and discuss their advantages over conventional approaches.
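
    To make the two central notions concrete, here is a small sketch (invented example, not taken from the article): naive bottom-up evaluation of deduction rules to a fixpoint, followed by the check of an integrity constraint over the derived database.

        # Facts (extensional database) and rules:
        #   anc(X,Y) <- parent(X,Y).   anc(X,Z) <- parent(X,Y), anc(Y,Z).
        EDB = {("parent", "ann", "bob"), ("parent", "bob", "cal")}
        RULES = [
            (("anc", "X", "Y"), [("parent", "X", "Y")]),
            (("anc", "X", "Z"), [("parent", "X", "Y"), ("anc", "Y", "Z")]),
        ]

        def match(atom, fact, env):
            """Extend env so that atom (uppercase = variable) matches fact."""
            if atom[0] != fact[0] or len(atom) != len(fact):
                return None
            env = dict(env)
            for a, f in zip(atom[1:], fact[1:]):
                if a[0].isupper():
                    if env.get(a, f) != f:
                        return None
                    env[a] = f
                elif a != f:
                    return None
            return env

        def evaluate(edb, rules):
            """Naive bottom-up fixpoint iteration."""
            facts = set(edb)
            changed = True
            while changed:
                changed = False
                for head, body in rules:
                    envs = [{}]
                    for atom in body:
                        envs = [e2 for e in envs for f in facts
                                if (e2 := match(atom, f, e)) is not None]
                    for env in envs:
                        fact = (head[0],) + tuple(
                            env[t] if t[0].isupper() else t for t in head[1:])
                        if fact not in facts:
                            facts.add(fact)
                            changed = True
            return facts

        db = evaluate(EDB, RULES)
        print(("anc", "ann", "cal") in db)   # True, by the recursive rule
        # Integrity constraint: nobody may be their own ancestor.
        print([f for f in db if f[0] == "anc" and f[1] == f[2]])  # []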

    Parsing Schemata

    Parsing schemata provide a general framework for the specification, analysis and comparison of (sequential and/or parallel) parsing algorithms. A grammar specifies implicitly what the valid parses of a sentence are; a parsing algorithm specifies explicitly how to compute these. Parsing schemata form a well-defined level of abstraction in between grammars and parsing algorithms. A parsing schema specifies the types of intermediate results that can be computed by a parser, and the rules that allow a given set of such results to be expanded with new results. A parsing schema does not specify the data structures, control structures, and (in the case of parallel processing) communication structures that are to be used by a parser.
    Part I, Exposition, gives a general introduction to the ideas that are worked out in the following parts.
    Part II, Foundation, unfolds a mathematical theory of parsing schemata. Different kinds of relations between parsing schemata are formally introduced and illustrated with examples drawn from the parsing literature.
    Part III, Application, discusses a series of applications of parsing schemata.
    - Feature percolation in unification-grammar parsing can be described in an elegant, legible notation.
    - Because of the absence of algorithmic detail, parsing schemata can be used to get a formal grip on highly complicated algorithms. We give substance to this claim by means of a thorough analysis of Left-Corner and Head-Corner chart parsing.
    - As an example of structural similarity of parsers, despite differences in form and appearance, we show that the underlying parsing schemata of Earley's algorithm and Tomita's algorithm are virtually identical. Using this structural correspondence we can obtain a novel parallel parser by cross-fertilizing a parallel Earley parser with Tomita's graph-structured stack.
    - Parsing schemata can be implemented straightforwardly by boolean circuits. This means that, in principle, parsing schemata can be coded directly into hardware.
    Part IV, Perspective, discusses the prospects for natural language parsing applications and draws some conclusions. An important observation is that the theoretical and practical parts of the book reinforce each other. The proposed framework is abstract enough to allow a thorough mathematical treatment and practical enough to allow rewriting a variety of real parsing algorithms (i.e. ones seriously proposed in the literature, not toy examples) in a clear and coherent way.
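
    The flavour of a parsing schema - an item set closed under inference rules, with no commitment to data, control, or communication structures - can be conveyed by a small Python sketch. The CNF grammar and the CYK-style completion rule below are invented for illustration and are not taken from the book.

        # Items (A, i, j) mean: nonterminal A derives word[i:j].
        UNARY  = {("A", "a"), ("B", "b")}     # A -> 'a', B -> 'b'
        BINARY = {("S", "A", "B")}            # S -> A B

        def parse(word):
            items = {(lhs, i, i + 1)          # axioms from lexical rules
                     for i, w in enumerate(word)
                     for (lhs, rhs) in UNARY if rhs == w}
            changed = True
            while changed:                    # close under the completion rule
                changed = False
                for (b, i, k) in list(items):
                    for (c, k2, j) in list(items):
                        if k2 != k:
                            continue
                        for (a, l, r) in BINARY:
                            if (l, r) == (b, c) and (a, i, j) not in items:
                                items.add((a, i, j))
                                changed = True
            return ("S", 0, len(word)) in items

        print(parse("ab"))   # True
        print(parse("ba"))   # False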

    Typed feature formalisms as a common basis for linguistic specification

    Typed feature formalisms (TFF) play an increasingly important role in CL and NLP. Many of these systems are inspired by Pollard and Sag's work on Head-Driven Phrase Structure Grammar (HPSG), which has shown that a great deal of syntax and semantics can be neatly encoded within TFF. However, syntax and semantics are not the only areas in which TFF can be beneficially employed. In this paper, I will show that TFF can also be used as a means to model finite automata (FA) and to perform certain types of logical inferencing. In particular, I will (i) describe how FA can be defined and processed within TFF and (ii) propose a conservative extension to HPSG, which allows for a restricted form of semantic processing within TFF, so that the construction of syntax and semantics can be intertwined with the simplification of the logical form of an utterance. The approach which I propose provides a uniform, HPSG-oriented framework for different levels of linguistic processing, including allomorphy and morphotactics, syntax, semantics, and logical form simplification.
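
    The paper's encoding of FA lives inside a typed feature formalism; the Python sketch below only mirrors its shape, with each state acting like a recursive definition that constrains the first input symbol and delegates the rest of the input to a successor state. The automaton itself is an invented example.

        # An automaton over {a, b} accepting exactly the strings ending in "ab".
        TRANS = {
            ("q0", "a"): "q1", ("q0", "b"): "q0",
            ("q1", "a"): "q1", ("q1", "b"): "q2",
            ("q2", "a"): "q1", ("q2", "b"): "q0",
        }
        FINAL = {"q2"}

        def accepts(state, s):
            """Recursive recognition, analogous to unfolding a recursive type."""
            if not s:
                return state in FINAL
            nxt = TRANS.get((state, s[0]))
            return nxt is not None and accepts(nxt, s[1:])

        print(accepts("q0", "aab"))   # True
        print(accepts("q0", "aba"))   # False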