
    Specifying Logic Programs in Controlled Natural Language

    Writing specifications for computer programs is not easy since one has to take into account the disparate conceptual worlds of the application domain and of software development. To bridge this conceptual gap we propose controlled natural language as a declarative and application-specific specification language. Controlled natural language is a subset of natural language that can be accurately and efficiently processed by a computer, but is expressive enough to allow natural usage by non-specialists. Specifications in controlled natural language are automatically translated into Prolog clauses and hence become formal and executable. The translation uses a definite clause grammar (DCG) enhanced by feature structures. Inter-text references of the specification, e.g. anaphora, are resolved with the help of discourse representation theory (DRT). The generated Prolog clauses are added to a knowledge base. We have implemented a prototypical specification system that successfully processes the specification of a simple automated teller machine. Comment: 16 pages, compressed, uuencoded Postscript, published in Proceedings of CLNLP 95, COMPULOGNET/ELSNET/EAGLES Workshop on Computational Logic for Natural Language Processing, Edinburgh, April 3-5, 1995
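    The translation step can be illustrated with a toy example (not the paper's DCG implementation, which is written in Prolog with feature structures): the sketch below maps one controlled-English sentence pattern onto a Prolog-style clause string. The pattern, predicate names and Python realisation are all invented for illustration.

        # Minimal sketch (not the paper's system): translate the controlled-English
        # pattern "Every NOUN is a NOUN." into a Prolog-style clause string.

        def translate(sentence: str) -> str:
            """Map e.g. 'Every customer is a person.' to 'person(X) :- customer(X).'"""
            words = sentence.lower().rstrip(".").split()
            # Expected shape: ['every', noun1, 'is', 'a', noun2]
            if len(words) != 5 or words[0] != "every" or words[2:4] != ["is", "a"]:
                raise ValueError("sentence does not match the toy 'Every N is a N.' pattern")
            noun1, noun2 = words[1], words[4]
            # Universally quantified reading: noun2 holds of X whenever noun1 does.
            return f"{noun2}(X) :- {noun1}(X)."

        print(translate("Every customer is a person."))
        # -> person(X) :- customer(X).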

    Montague Grammar, Categories and Types: a presentation of actual theories in Semantics and Discourse Interpretation

    This paper presents the origins of Montague Grammar in relation to Generative Grammar and Categorial Grammars, and its current application to linguistic models of discourse interpretation. The proposed model is based on Gabbay's Labelled Deductive Systems, as developed by Dov Gabbay and Ruth Kempson in the 1990s.
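    For readers unfamiliar with the Montague-style setup the paper builds on, the toy sketch below shows the core idea of compositional interpretation as function application. The lexicon and Python encoding are invented here; the paper's Labelled Deductive Systems machinery is not modelled.

        # Toy Montague-style composition (illustrative only): word meanings are
        # functions, and interpreting a sentence is function application.

        john = "john"                                 # an individual constant
        walk = lambda x: f"walk({x})"                 # intransitive verb: e -> t
        love = lambda y: lambda x: f"love({x},{y})"   # transitive verb: e -> (e -> t)

        # "John walks": apply the verb meaning to the subject meaning.
        print(walk(john))            # -> walk(john)
        # "John loves Mary": the object combines first, then the subject.
        print(love("mary")(john))    # -> love(john,mary)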

    Natural language semantics and compiler technology

    This paper recommends an approach to the implementation of semantic representation languages (SRLs) which exploits a parallelism between SRLs and programming languages (PLs). The design requirements of SRLs for natural language are similar to those of PLs in their goals. First, in both cases we seek modules in which both the surface representation (print form) and the underlying data structures are important. This requirement highlights the need for general tools allowing the printing and reading of expressions (data structures). Second, these modules need to cooperate with foreign modules, so that the importance of interface technology (compilation) is paramount; and third, both compilers and semantic modules need "inferential" facilities for transforming (simplifying) complex expressions in order to ease subsequent processing. But the most important parallel is the need in both fields for tools which are useful in combination with a variety of concrete languages -- general purpose parsers, printers, simplifiers (transformation facilities) and compilers. This arises in PL technology from (among other things) the need for experimentation in language design, which is again parallel to the case of SRLs. Using a compiler-based approach, we have implemented NLL, a public domain software package for computational natural language semantics. Several interfaces exist both for grammar modules and for applications, using a variety of interface technologies, including especially compilation. We review here a variety of NLL applications, focusing on COSMA, an NL interface to a distributed appointment manager.
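    The parallel drawn above can be made concrete with a toy semantic-representation module (invented here, not the NLL package): a small expression data structure together with the three facilities the abstract highlights, a printer, a reader and a simplifier.

        # Toy SRL module (not NLL): expressions are nested tuples such as
        # ("and", "p", ("not", ("not", "q"))).  Three facilities are sketched:
        # a printer (surface form), a reader (parser) and a simplifier (transformer).

        def show(e):
            """Print an expression in a Lisp-like surface form."""
            if isinstance(e, str):
                return e
            return "(" + " ".join(show(x) for x in e) + ")"

        def read(text):
            """Parse the Lisp-like surface form back into nested tuples."""
            tokens = text.replace("(", " ( ").replace(")", " ) ").split()
            def parse(i):
                if tokens[i] == "(":
                    items, i = [], i + 1
                    while tokens[i] != ")":
                        item, i = parse(i)
                        items.append(item)
                    return tuple(items), i + 1
                return tokens[i], i + 1
            expr, _ = parse(0)
            return expr

        def simplify(e):
            """Rewrite (not (not X)) to X recursively -- a stand-in for inferential simplification."""
            if isinstance(e, str):
                return e
            e = tuple(simplify(x) for x in e)
            if len(e) == 2 and e[0] == "not" and isinstance(e[1], tuple) and e[1][0] == "not":
                return e[1][1]
            return e

        expr = read("(and p (not (not q)))")
        print(show(simplify(expr)))   # -> (and p q)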

    Categorial Minimalist Grammar: From Generative Syntax To Logical Form

    We first recall some basic notions on minimalist grammars and on categorial grammars. Next we briefly introduce partially commutative linear logic and our representation of minimalist grammars within this categorial system, the so-called categorial minimalist grammars. Thereafter we briefly present λμ-DRT (Discourse Representation Theory), an extension of λ-DRT (compositional DRT) in the framework of the λμ-calculus: it avoids type raising and derives different readings from a single semantic representation, in a setting which follows discourse structure. We run a complete example which illustrates the various structures and rules that are needed to derive a semantic representation from the categorial view of a transformational syntactic analysis.
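    As background for the derivation sketched above, here is a minimal illustration of categorial function application, the basic combination step of categorial grammars. The category encoding and lexicon are invented; the paper's partially commutative linear logic and λμ-DRT layers are not modelled.

        # Toy categorial grammar (illustrative only): a category is an atom such as
        # "np" or a functor (result, slash, argument).  Forward application combines
        # X/Y with a Y on its right; backward application combines a Y with X\Y on its right.

        NP, S = "np", "s"
        IV = (S, "\\", NP)             # intransitive verb: looks left for an np, yields s
        TV = ((S, "\\", NP), "/", NP)  # transitive verb: looks right for an np first

        def apply_forward(functor, arg):
            return functor[0] if isinstance(functor, tuple) and functor[1] == "/" and functor[2] == arg else None

        def apply_backward(arg, functor):
            return functor[0] if isinstance(functor, tuple) and functor[1] == "\\" and functor[2] == arg else None

        # "John sees Mary": the transitive verb combines with its object, then its subject.
        vp = apply_forward(TV, NP)         # -> (s, "\\", np), i.e. a verb phrase
        sentence = apply_backward(NP, vp)  # -> "s"
        print(vp, sentence)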

    RRL: A Rich Representation Language for the Description of Agent Behaviour in NECA

    In this paper, we describe the Rich Representation Language (RRL), which is used in the NECA system. The NECA system generates interactions between two or more animated characters. The RRL is a formal framework for representing the information that is exchanged at the interfaces between the various NECA system modules.
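    Purely as an invented stand-in (the actual RRL is defined in the paper, and none of the field names below are taken from it), the sketch shows the general idea of passing a structured turn description between generation modules.

        # Invented stand-in, not the actual RRL: one way to hand a structured turn
        # description from one generation module to the next is a typed record
        # serialised as JSON.  All field names here are hypothetical.

        import json
        from dataclasses import dataclass, asdict

        @dataclass
        class TurnSpec:
            speaker: str        # which animated character speaks
            dialogue_act: str   # e.g. "inform", "request"
            content: str        # propositional content, kept as a plain string here
            emotion: str        # annotation a later module can realise as gesture or voice

        turn = TurnSpec(speaker="agent_1", dialogue_act="inform",
                        content="car_has_airbag", emotion="enthusiastic")
        print(json.dumps(asdict(turn)))   # the message one module would pass on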

    Software for Applied Semantics


    Applying automated deduction to natural language understanding

    Very few natural language understanding applications employ methods from automated deduction. This is mainly because (i) a high level of interdisciplinary knowledge is required, (ii) there is a huge gap between formal semantic theory and practical implementation, and (iii) statistical rather than symbolic approaches dominate the current trends in natural language processing. Moreover, abduction rather than deduction is generally viewed as a promising way to apply reasoning in natural language understanding. We describe three applications where we show how first-order theorem proving and finite model construction can efficiently be employed in language understanding. The first is a text understanding system building semantic representations of texts, developed in the late 1990s. Theorem provers are here used to signal inconsistent interpretations and to check whether new contributions to the discourse are informative or not. This application shows that it is feasible to use general-purpose theorem provers for first-order logic, and that it pays off to use a battery of different inference engines, as in practice they complement each other in terms of performance. The second application is a spoken-dialogue interface to a mobile robot and an automated home. We use the first-order theorem prover SPASS for checking inconsistencies and newness of information, but the inference tasks are complemented with the finite model builder Mace used in parallel to the prover. The model builder is used to check for satisfiability of the input; in addition, the produced finite and minimal models are used to determine the actions that the robot or automated house has to execute. When the semantic representation of the dialogue as well as the number of objects in the context are kept fairly small, response times are acceptable to human users. The third demonstration of successful use of first-order inference engines comes from the task of recognising entailment between two (short) texts. We run a robust parser producing semantic representations for both texts, and use the theorem prover Vampire to check whether one text entails the other. For many examples it is hard to compute the appropriate background knowledge in order to produce a proof, and the model builders Mace and Paradox are used to estimate the likelihood of an entailment.
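    The prover-plus-model-builder scheme described above can be sketched in miniature. The code below is an illustration only: it uses brute-force propositional reasoning in place of the external first-order provers (SPASS, Vampire) and model builders (Mace, Paradox) named in the abstract, and the example formulas are invented.

        # Simplified sketch of the consistency/informativeness check described above,
        # with truth-table enumeration standing in for a first-order prover and model builder.

        from itertools import product

        def atoms(f):
            """Collect the propositional atoms occurring in a formula."""
            return {f} if isinstance(f, str) else set().union(*(atoms(a) for a in f[1:]))

        def holds(f, v):
            """Evaluate a formula under an assignment v (dict from atom to bool)."""
            if isinstance(f, str):
                return v[f]
            op = f[0]
            if op == "not":
                return not holds(f[1], v)
            if op == "and":
                return all(holds(a, v) for a in f[1:])
            if op == "implies":
                return (not holds(f[1], v)) or holds(f[2], v)
            raise ValueError(op)

        def models(f):
            """All satisfying assignments -- the 'model builder'."""
            names = sorted(atoms(f))
            return [dict(zip(names, vals)) for vals in product([True, False], repeat=len(names))
                    if holds(f, dict(zip(names, vals)))]

        def entails(premise, conclusion):
            """Entailment test -- the 'theorem prover'."""
            return not models(("and", premise, ("not", conclusion)))

        def check_utterance(context, new_info):
            """Classify a new contribution against the discourse context so far."""
            if entails(context, ("not", new_info)):
                return "inconsistent"
            if entails(context, new_info):
                return "uninformative"
            return ("informative", models(("and", context, new_info))[0])

        context = ("and", "door_closed", ("implies", "door_closed", "light_off"))
        print(check_utterance(context, "light_off"))           # uninformative
        print(check_utterance(context, ("not", "light_off")))  # inconsistent
        print(check_utterance(context, "robot_in_kitchen"))    # informative, with a model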

    The Verbmobil semantic formalism (Version 1.3)

    This report describes the semantic formalism developed at Saarbrücken University as part of the Verbmobil project. The formalism is based upon DRT with additional functionality to meet the requirements on semantic construction arising from spoken dialogue translation. We define the syntax of the formalism and illustrate the semantic composition process in detail.
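    To give a flavour of the DRT basis mentioned above (a generic textbook-style sketch, not the Verbmobil formalism itself), a discourse representation structure can be modelled as a set of discourse referents plus a set of conditions, with composition as merge.

        # Generic DRS sketch (not the Verbmobil formalism): a DRS pairs discourse
        # referents with conditions; composing two DRSs merges both parts.

        from dataclasses import dataclass, field

        @dataclass
        class DRS:
            referents: set = field(default_factory=set)
            conditions: set = field(default_factory=set)

            def merge(self, other: "DRS") -> "DRS":
                """DRS composition: union the referents and the conditions."""
                return DRS(self.referents | other.referents,
                           self.conditions | other.conditions)

        # "A man sleeps." followed by "He snores." -- the pronoun has already been
        # resolved to x here; anaphora resolution itself is not modelled.
        first = DRS({"x"}, {"man(x)", "sleep(x)"})
        second = DRS(set(), {"snore(x)"})
        print(first.merge(second))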

    Reading Between the Lines: Verifying Mathematical Language

    A great deal of work has been done on automatically generating proofs of formal statements. However, these systems tend to focus on logic-oriented statements and tactics, and they produce proofs in a formal language. This project examines proofs written in natural language over a more general scope of mathematics. Furthermore, rather than attempting to generate natural language proofs for the purpose of solving problems, we automatically verify human-written proofs in natural language. To accomplish this, elements of discourse parsing, semantic interpretation, and automated theorem proving are combined.
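    The verification pipeline described above can be sketched in a simplified form. In the toy below, the discourse parsing and semantic interpretation stages are replaced by hand-written formula names, and the theorem prover by forward chaining with modus ponens; all rule and proposition names are hypothetical.

        # Pipeline sketch (not the project's system): check that every step of a
        # proof follows from the assumptions and the earlier steps.

        def derivable(known, goal, rules):
            """Close `known` under modus ponens over `rules`, then test for `goal`."""
            known, changed = set(known), True
            while changed:
                changed = False
                for premise, conclusion in rules:
                    if premise in known and conclusion not in known:
                        known.add(conclusion)
                        changed = True
            return goal in known

        # Hypothetical translations of the proof's sentences into atomic formulas.
        assumptions = {"n_is_even"}
        rules = [("n_is_even", "n_squared_is_even"),
                 ("n_squared_is_even", "n_squared_is_not_odd")]
        proof_steps = ["n_squared_is_even", "n_squared_is_not_odd"]

        verified = set(assumptions)
        for step in proof_steps:
            if not derivable(verified, step, rules):
                print("step does not follow:", step)
                break
            verified.add(step)
        else:
            print("all steps verified")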