21 research outputs found

    CHR Grammars

    A grammar formalism based upon CHR is proposed analogously to the way Definite Clause Grammars are defined and implemented on top of Prolog. These grammars execute as robust bottom-up parsers with an inherent treatment of ambiguity and a high flexibility to model various linguistic phenomena. The formalism extends previous logic programming based grammars with a form of context-sensitive rules and the possibility to include extra-grammatical hypotheses in both head and body of grammar rules. Among the applications are straightforward implementations of Assumption Grammars and abduction under integrity constraints for language analysis. CHR grammars appear as a powerful tool for specification and implementation of language processors and may be proposed as a new standard for bottom-up grammars in logic programming. To appear in Theory and Practice of Logic Programming (TPLP), 2005. Comment: 36 pp.
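
    A minimal Python sketch of the store-based, bottom-up parsing style the abstract describes may make the idea concrete: grammar rules act as propagation rules over a growing constraint store, and ambiguous analyses simply coexist as alternative constraints. This is an illustration only, not CHRG syntax; the toy lexicon, the single rule, and all identifiers are assumptions.

        # Sketch of CHR-grammar-style bottom-up parsing (illustrative only,
        # not the CHRG system itself; the toy grammar below is an assumption).

        def parse(words, rules):
            # The store holds constraints of the form (category, start, end, payload).
            store = {("word", i, i + 1, w) for i, w in enumerate(words)}
            changed = True
            while changed:                    # apply propagation rules to a fixpoint
                changed = False
                for rule in rules:
                    for new in rule(store):
                        if new not in store:  # derived constraints are never retracted,
                            store.add(new)    # so all ambiguous analyses survive
                            changed = True
            return store

        def lexicon(store):
            cats = {"the": "det", "dome": "noun"}
            return {(cats[w], i, j, w) for (tag, i, j, w) in store
                    if tag == "word" and w in cats}

        def det_noun_to_np(store):
            # One toy rule: a det constraint followed by an adjacent noun yields an np.
            dets = [c for c in store if c[0] == "det"]
            nouns = [c for c in store if c[0] == "noun"]
            return {("np", d[1], n[2], (d, n))
                    for d in dets for n in nouns if d[2] == n[1]}

        if __name__ == "__main__":
            chart = parse("the dome".split(), [lexicon, det_noun_to_np])
            print(sorted(c for c in chart if c[0] == "np"))

    In this picture, the context-sensitive rules and extra-grammatical hypotheses mentioned in the abstract correspond to rules whose guards consult additional constraints already present in the store before firing.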

    CHR as grammar formalism. A first report

    Grammars written as Constraint Handling Rules (CHR) can be executed as efficient and robust bottom-up parsers that provide a straightforward, non-backtracking treatment of ambiguity. Abduction with integrity constraints as well as other dynamic hypothesis generation techniques fit naturally into such grammars and are exemplified for anaphora resolution, coordination and text interpretation. Comment: 12 pages. Presented at the ERCIM Workshop on Constraints, Prague, Czech Republic, June 18-20, 200

    Constraints and Language

    More information on the publisher's webpage: http://www.cambridgescholars.com/constraints-and-language. The concept of "constraint" is widely used in linguistics, computer science, and psychology. However, its implementation varies widely depending on the research domain: namely, language description, knowledge representation, cognitive modelling, and problem solving. These various uses of constraints offer complementary views on intelligent mechanisms. For example, in-depth descriptions implementing constraints are used in linguistics to filter out syntactic or discursive structures by means of dedicated description languages and constraint ranking. In computer science, the constraint programming paradigm views constraints as a whole, which can be used, for example, to build specific structures. Finally, in psycholinguistics, experiments are carried out to investigate the role of constraints within cognitive processes (both in comprehension and production), with various applications such as dialog modelling for people with disabilities. In this context, Constraints and Language builds an extended overview of the use of constraints to model and process language.

    CHR(PRISM)-based Probabilistic Logic Learning

    PRISM is an extension of Prolog with probabilistic predicates and built-in support for expectation-maximization learning. Constraint Handling Rules (CHR) is a high-level programming language based on multi-headed multiset rewrite rules. In this paper, we introduce a new probabilistic logic formalism, called CHRiSM, based on a combination of CHR and PRISM. It can be used for high-level rapid prototyping of complex statistical models by means of "chance rules". The underlying PRISM system can then be used for several probabilistic inference tasks, including probability computation and parameter learning. We define the CHRiSM language in terms of syntax and operational semantics, and illustrate it with examples. We define the notion of ambiguous programs and give a distribution semantics for unambiguous programs. Next, we describe an implementation of CHRiSM, based on CHR(PRISM). We discuss the relation between CHRiSM and other probabilistic logic programming languages, in particular PCHR. Finally, we identify potential application domains.
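
    The "chance rules" at the heart of CHRiSM are, roughly, rules that fire only with a given probability, so that a program denotes a distribution over final constraint stores. The following Python sketch imitates that idea on a list-based store; it is an illustration under assumptions only (CHRiSM itself is CHR(PRISM) code, not Python, and the 0.7 probability and the toy 'person'/'infected' constraints are invented for the example).

        import random

        # Sketch of the "chance rule" idea: a rewrite that fires probabilistically.
        # Illustrative assumption, not the CHRiSM implementation.

        def chance_rule(store, prob, guard, rewrite):
            """Apply a probabilistic rewrite to every matching constraint."""
            out = []
            for c in store:
                if guard(c) and random.random() < prob:
                    out.extend(rewrite(c))   # rule fires: replace the constraint
                else:
                    out.append(c)            # rule does not fire: keep it unchanged
            return out

        # Toy model: each 'person' constraint independently becomes 'infected'
        # with probability 0.7; repeated runs sample from the induced distribution.
        store = [("person", i) for i in range(5)]
        sample = chance_rule(store, 0.7,
                             guard=lambda c: c[0] == "person",
                             rewrite=lambda c: [("infected", c[1])])
        print(sample)

    In the actual system the probability need not be fixed by hand; as the abstract notes, PRISM's built-in expectation-maximization learning can estimate such parameters from observed outcomes.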

    Logic grammars for diagnosis and repair


    Contextual graph grammars characterising Rational Graphs

    Deterministic graph grammars generate a family of infinite graphs which characterise context-free (word) languages. The present paper introduces a context-sensitive extension of these grammars. We prove that this extension characterises rational graphs (whose traces are context-sensitive languages). We illustrate that this extension is not straightforward: the most obvious context-sensitive graph rewriting systems generate non-recursive infinite graphs.

    On Language Acquisition through Womb Grammars

    We propose to automate the field of language acquisition evaluation through constraint solving; in particular, through the use of Womb Grammars. Womb Grammar Parsing is a novel constraint-based paradigm that was devised mainly to induce grammatical structure from the description of its syntactic constraints in a related language. In this paper we argue that it is also ideal for automating the evaluation of language acquisition, and present as proof of concept a CHRG system for detecting which of fourteen levels of morphological proficiency a child is at, from a representative sample of the child's expressions. Our results also uncover ways in which the linguistic constraints that characterize a grammar need to be tailored to language acquisition applications. We also put forward a proposal for discovering in what order such levels are typically acquired in languages other than English. Our findings have great potential practical value, in that they can help educators tailor the games, stories, songs, etc. that can aid a child (or a second-language learner) to progress in a timely fashion to the next level of proficiency, and can also help shed light on the processes by which languages less studied than English are acquired.

    Using Womb Grammars for Inducing the Grammar of a Subset of Yorùbá Noun Phrases

    We address the problem of inducing the grammar of an under-resourced language, Yorùbá, from the grammar of English, using an efficient and linguistically savvy constraint-solving model of grammar induction, Womb Grammars (WG). Our proposed methodology adapts WG for parsing a subset of noun phrases of the target language Yorùbá, from the grammar of the source language English, which is described as properties between pairs of constituents. Our model is implemented in CHRG (Constraint Handling Rule Grammar) and has been used for inducing the grammar of a useful subset of Yorùbá Noun Phrases. Interesting extensions to the original Womb Grammar model are presented, motivated by the specific needs of Yorùbá and similar tone languages.
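
    To illustrate the "properties between pairs of constituents" idea, the following Python sketch checks a source grammar's precedence properties against correct phrases of the target language and discards the ones those phrases violate. It is only a conceptual sketch under assumptions: the actual system is written in CHRG, handles more property types than precedence, and the toy phrase data below is invented rather than real Yorùbá.

        # Sketch of property-based grammar induction (illustrative assumptions only).

        def violates_precedence(a, b, phrase):
            """True if category b occurs before category a in the tagged phrase."""
            positions = {cat: i for i, cat in enumerate(phrase)}
            return a in positions and b in positions and positions[b] < positions[a]

        def induce(source_properties, target_phrases):
            """Keep only the source-language precedence properties that no
            correct target-language phrase violates."""
            kept = set(source_properties)
            for a, b in source_properties:
                if any(violates_precedence(a, b, p) for p in target_phrases):
                    kept.discard((a, b))   # property does not hold in the target language
            return kept

        # Source (English-style) NP properties: det precedes noun, adj precedes noun.
        english_np = {("det", "noun"), ("adj", "noun")}
        # Toy target-language data in which adjectives follow the noun.
        target_phrases = [["det", "noun", "adj"], ["noun", "adj"]]
        print(induce(english_np, target_phrases))   # -> {('det', 'noun')}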

    Domes as a Prodigal Shape in Synthesis-Enhanced Parsers

    Research on logic-based bottom-up parsing, in particular around Constraint Handling Rule Grammars [3], is uncovering shape as an untapped fertile ground for natural language processing in general, and for bottom-up parsing and grammar induction in particular.