
    Concurrent Lexicalized Dependency Parsing: A Behavioral View on ParseTalk Events

    The behavioral specification of an object-oriented grammar model is considered. The model is based on full lexicalization, head-orientation via valency constraints and dependency relations, inheritance as a means for non-redundant lexicon specification, and concurrency of computation. The computation model relies upon the actor paradigm, with concurrency entering through asynchronous message passing between actors. In particular, we elaborate here on principles of how the global behavior of a lexically distributed grammar and its corresponding parser can be specified in terms of event type networks and event networks, respectively. Comment: 68kB, 5 pages, PostScript
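
    To make the computation model concrete, the sketch below illustrates the general actor idea behind such a parser: each word is an actor with a mailbox, and dependency relations are established by asynchronous attachment messages checked against valency constraints. This is a minimal illustrative reconstruction in Python, not the ParseTalk system; the WordActor class, the category labels, and the all-pairs message protocol are assumptions made for the example.

```python
# Illustrative sketch (not the ParseTalk implementation): word-level actors with
# mailboxes exchange asynchronous attachment offers; a head accepts a dependent
# when one of its remaining valency slots matches the dependent's category.
import asyncio

class WordActor:
    def __init__(self, word, category, valency):
        self.word = word                   # surface form, e.g. "sleeps"
        self.category = category           # lexical category, e.g. "V"
        self.valency = set(valency)        # categories this head still requires
        self.mailbox = asyncio.Queue()     # asynchronous message passing
        self.dependents = []

    async def run(self):
        # Process attachment offers until a stop message (None) arrives.
        while True:
            sender = await self.mailbox.get()
            if sender is None:
                break
            if sender.category in self.valency:     # valency constraint satisfied
                self.valency.discard(sender.category)
                self.dependents.append(sender.word)

async def parse(lexical_entries):
    actors = [WordActor(w, c, v) for w, c, v in lexical_entries]
    tasks = [asyncio.create_task(a.run()) for a in actors]
    # Every actor offers itself to every other actor, asynchronously; heads
    # pick up the offers that satisfy their valencies.
    for a in actors:
        for b in actors:
            if a is not b:
                await b.mailbox.put(a)
    for a in actors:
        await a.mailbox.put(None)          # tell every actor to stop
    await asyncio.gather(*tasks)
    return {a.word: a.dependents for a in actors}

# Toy lexicon: the verb subcategorizes for one nominal dependent.
print(asyncio.run(parse([("John", "N", []), ("sleeps", "V", ["N"])])))
# -> {'John': [], 'sleeps': ['John']}
```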

    Higher-order Linear Logic Programming of Categorial Deduction

    We show how categorial deduction can be implemented in higher-order (linear) logic programming, thereby realising parsing as deduction for the associative and non-associative Lambek calculi. This provides a method for solving the parsing problem of Lambek categorial grammar that is applicable to a variety of its extensions. Comment: 8 pages, LaTeX, uses eaclap.sty, to appear EACL95
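
    The deductive reading of categorial parsing can be illustrated with a toy backward-chaining prover for the product-free associative Lambek calculus: a string is grammatical iff the sequent formed from its lexical categories is derivable. The sketch below is a plain Python approximation under that assumption, not the paper's higher-order linear logic programming encoding; the category encoding and the restriction to atomic axioms are choices made for the example.

```python
# A naive backward-chaining prover for the product-free associative Lambek
# calculus: "parsing as deduction" in miniature. Categories are strings (atoms)
# or tuples ('/', A, B) for A/B and ('\\', B, A) for B\A.

def proves(antecedent, goal):
    """Is the sequent  antecedent |- goal  derivable (cut-free search)?"""
    antecedent = tuple(antecedent)
    # Axiom: A |- A for atomic A (identity on complex categories is derivable).
    if len(antecedent) == 1 and antecedent[0] == goal and isinstance(goal, str):
        return True
    # Right rules: decompose the goal.
    if isinstance(goal, tuple):
        if goal[0] == '/':                     # Gamma |- A/B  from  Gamma, B |- A
            _, a, b = goal
            if proves(antecedent + (b,), a):
                return True
        if goal[0] == '\\':                    # Gamma |- B\A  from  B, Gamma |- A
            _, b, a = goal
            if proves((b,) + antecedent, a):
                return True
    # Left rules: decompose one functor category in the antecedent.
    for i, cat in enumerate(antecedent):
        if isinstance(cat, tuple) and cat[0] == '/':     # Gamma1, A/B, Delta, Gamma2 |- C
            _, a, b = cat
            rest = antecedent[i + 1:]
            for j in range(1, len(rest) + 1):            # Delta is nonempty
                delta, gamma2 = rest[:j], rest[j:]
                if proves(delta, b) and proves(antecedent[:i] + (a,) + gamma2, goal):
                    return True
        if isinstance(cat, tuple) and cat[0] == '\\':    # Gamma1, Delta, B\A, Gamma2 |- C
            _, b, a = cat
            before = antecedent[:i]
            for j in range(len(before)):                 # Delta is nonempty
                gamma1, delta = before[:j], before[j:]
                if proves(delta, b) and proves(gamma1 + (a,) + antecedent[i + 1:], goal):
                    return True
    return False

# "John likes Mary": likes := (NP\S)/NP, so  NP, (NP\S)/NP, NP |- S  holds.
NP, S = 'NP', 'S'
likes = ('/', ('\\', NP, S), NP)
print(proves([NP, likes, NP], S))   # -> True
```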

    Treating Coordination with Datalog Grammars

    In previous work we studied a new type of DCGs, Datalog grammars, which are inspired by database theory. Their efficiency was shown to be better than that of their DCG counterparts under (terminating) OLDT-resolution. In this article we motivate a variant of Datalog grammars which allows a meta-grammatical treatment of coordination. This treatment improves in some respects over previous work on coordination in logic grammars, although more research is needed to test it in other respects.
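
    The core idea of Datalog grammars can be pictured as follows: lexical entries become facts over string positions and grammar rules become Datalog clauses, so recognition is bottom-up (terminating) closure of those clauses. The fragment below is a hypothetical illustration of that evaluation strategy in Python, including a simple coordination rule; it is not the meta-grammatical treatment proposed in the article, and the rules and lexicon shown are invented for the example.

```python
# Hypothetical illustration of the Datalog-grammar view of recognition: lexical
# entries are facts (category, i, j) over string positions, productions act as
# Datalog clauses, and parsing is bottom-up evaluation to a fixpoint.

def recognize(words, rules, lexicon, start):
    n = len(words)
    facts = {(cat, i, i + 1) for i, w in enumerate(words) for cat in lexicon[w]}
    changed = True
    while changed:                          # naive bottom-up iteration to a fixpoint
        changed = False
        for lhs, rhs in rules:
            # Chain adjacent spans through the right-hand side, Datalog-join style.
            spans = [(i, i) for i in range(n + 1)]
            for cat in rhs:
                spans = [(i, k) for (i, j) in spans
                         for (c, j2, k) in facts if c == cat and j2 == j]
            for i, j in spans:
                if (lhs, i, j) not in facts:
                    facts.add((lhs, i, j))
                    changed = True
    return (start, 0, n) in facts

# Tiny grammar with a coordination rule for verb phrases (invented example).
rules = [('S', ['NP', 'VP']),
         ('VP', ['VP', 'CONJ', 'VP'])]
lexicon = {'john': {'NP'}, 'walks': {'VP'}, 'talks': {'VP'}, 'and': {'CONJ'}}
print(recognize('john walks and talks'.split(), rules, lexicon, 'S'))   # -> True
```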

    CLiFF Notes: Research in the Language Information and Computation Laboratory of The University of Pennsylvania

    This report takes its name from the Computational Linguistics Feedback Forum (CLiFF), an informal discussion group for students and faculty. However, the scope of the research covered in this report is broader than the title might suggest; this is the yearly report of the LINC Lab, the Language, Information and Computation Laboratory of the University of Pennsylvania. It may at first be hard to see the threads that bind together the work presented here, work by faculty, graduate students and postdocs in the Computer Science, Psychology, and Linguistics Departments, and the Institute for Research in Cognitive Science. It includes prototypical Natural Language fields such as: Combinatorial Categorial Grammars, Tree Adjoining Grammars, syntactic parsing and the syntax-semantics interface; but it extends to statistical methods, plan inference, instruction understanding, intonation, causal reasoning, free word order languages, geometric reasoning, medical informatics, connectionism, and language acquisition. With 48 individual contributors and six projects represented, this is the largest LINC Lab collection to date, and the most diverse.

    Principles and Implementation of Deductive Parsing

    We present a system for generating parsers based directly on the metaphor of parsing as deduction. Parsing algorithms can be represented directly as deduction systems, and a single deduction engine can interpret such deduction systems so as to implement the corresponding parser. The method generalizes easily to parsers for augmented phrase structure formalisms, such as definite-clause grammars and other logic grammar formalisms, and has been used for rapid prototyping of parsing algorithms for a variety of formalisms including variants of tree-adjoining grammars, categorial grammars, and lexicalized context-free grammars. Comment: 69 pages, includes full Prolog code
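
    The architecture the abstract describes separates the deduction system (axioms and inference rules, given as data) from a single agenda-driven engine that computes their closure. The sketch below shows one way such an engine can look, instantiated with a CYK-style item form; it is an illustrative approximation in Python rather than the paper's Prolog implementation, and the item encoding and rule format are assumptions of the example.

```python
# Sketch of a generic agenda-driven deduction engine: the deduction system
# (axioms plus inference rules) is data, and one engine computes its closure.
# The CYK-style item form below is an illustrative choice, not the paper's.

def deduce(axioms, rules):
    """Close `axioms` under `rules`; each rule maps a newly derived item and
    the current chart to a list of further items."""
    chart, agenda = set(), list(axioms)
    while agenda:
        item = agenda.pop()
        if item in chart:
            continue
        chart.add(item)
        for rule in rules:
            agenda.extend(rule(item, chart))
    return chart

# Items ('item', A, i, j): category A spans positions i..j of the input.
def cyk_rules(grammar):
    binary = [(a, b, c) for a, (b, c) in grammar if c is not None]

    def complete(item, chart):
        _, cat, i, j = item
        new = []
        for a, b, c in binary:               # use the new item as either daughter
            new += [('item', a, i, k) for (_, c2, j2, k) in chart
                    if cat == b and c2 == c and j2 == j]
            new += [('item', a, h, j) for (_, b2, h, i2) in chart
                    if cat == c and b2 == b and i2 == i]
        return new
    return [complete]

# Grammar in Chomsky normal form; entries (A, (word, None)) are lexical.
grammar = [('S', ('NP', 'VP')), ('VP', ('V', 'NP')),
           ('NP', ('john', None)), ('NP', ('mary', None)), ('V', ('likes', None))]
words = 'john likes mary'.split()
axioms = [('item', a, i, i + 1) for i, w in enumerate(words)
          for a, (b, c) in grammar if c is None and b == w]
chart = deduce(axioms, cyk_rules(grammar))
print(('item', 'S', 0, len(words)) in chart)   # -> True
```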

    CLiFF Notes: Research in the Language, Information and Computation Laboratory of the University of Pennsylvania

    One concern of the Computer Graphics Research Lab lies in simulating human task behavior and understanding why the visualization of the appearance, capabilities and performance of humans is so challenging. Our research has produced a system, called Jack, for the definition, manipulation, animation and human factors analysis of simulated human figures. Jack permits the envisionment of human motion by interactive specification and simultaneous execution of multiple constraints, and is sensitive to such issues as body shape and size, linkage, and plausible motions. Enhanced control is provided by natural behaviors such as looking, reaching, balancing, lifting, stepping, walking, grasping, and so on. Although intended for highly interactive applications, Jack is a foundation for other research. The very ubiquitousness of other people in our lives poses a tantalizing challenge to the computational modeler: people are at once the most common object around us, and yet the most structurally complex. Their everyday movements are amazingly fluid, yet demanding to reproduce, with actions driven not just mechanically by muscles and bones but also cognitively by beliefs and intentions. Our motor systems manage to learn how to make us move without leaving us the burden or pleasure of knowing how we did it. Likewise we learn how to describe the actions and behaviors of others without consciously struggling with the processes of perception, recognition, and language. Present technology lets us approach human appearance and motion through computer graphics modeling and three-dimensional animation, but there is considerable distance to go before purely synthesized figures trick our senses. We seek to build computational models of human-like figures that manifest animacy and convincing behavior. Towards this end, we: create an interactive computer graphics human model; endow it with reasonable biomechanical properties; provide it with human-like behaviors; use this simulated figure as an agent to effect changes in its world; describe and guide its tasks through natural language instructions. There are presently no perfect solutions to any of these problems; ultimately, however, we should be able to give our surrogate human directions that, in conjunction with suitable symbolic reasoning processes, make it appear to behave in a natural, appropriate, and intelligent fashion. Compromises will be essential, due to limits in computation, throughput of display hardware, and demands of real-time interaction, but our algorithms aim to balance the physical device constraints with carefully crafted models, general solutions, and thoughtful organization. The Jack software is built on Silicon Graphics Iris 4D workstations because those systems have 3-D graphics features that greatly aid the process of interacting with highly articulated figures such as the human body. Of course, graphics capabilities themselves do not make a usable system. Our research has therefore focused on software to make the manipulation of a simulated human figure easy for a rather specific user population: human factors design engineers or ergonomics analysts involved in visualizing and assessing human motor performance, fit, reach, view, and other physical tasks in a workplace environment. The software also happens to be quite usable by others, including graduate students and animators.
The point, however, is that program design has tried to take into account a wide variety of physical, problem-oriented tasks, rather than just offer a computer graphics and animation tool for the already computer-sophisticated or skilled animator. As an alternative to interactive specification, a simulation system allows a convenient temporal and spatial parallel programming language for behaviors. The Graphics Lab is working with the Natural Language Group to explore the possibility of using natural language instructions, such as those found in assembly or maintenance manuals, to drive the behavior of our animated human agents. (See the CLiFF note entry for the AnimNL group for details.) Even though Jack is under continual development, it has nonetheless already proved to be a substantial computational tool in analyzing human abilities in physical workplaces. It is being applied to actual problems involving space vehicle inhabitants, helicopter pilots, maintenance technicians, foot soldiers, and tractor drivers. This broad range of applications is precisely the target we intended to reach. The general capabilities embedded in Jack attempt to mirror certain aspects of human performance, rather than the specific requirements of the corresponding workplace. We view the Jack system as the basis of a virtual animated agent that can carry out tasks and instructions in a simulated 3D environment. While we have not yet fooled anyone into believing that the Jack figure is real, its behaviors are becoming more reasonable and its repertoire of actions more extensive. When interactive control becomes more labor-intensive than natural language instructional control, we will have reached a significant milestone toward an intelligent agent.

    CLiFF Notes: Research In Natural Language Processing at the University of Pennsylvania

    The Computational Linguistics Feedback Forum (CLiFF) is a group of students and faculty who gather once a week to discuss the members' current research. As the word feedback suggests, the group's purpose is the sharing of ideas. The group also promotes interdisciplinary contacts between researchers who share an interest in Cognitive Science. There is no single theme describing the research in Natural Language Processing at Penn. There is work done in CCG, Tree adjoining grammars, intonation, statistical methods, plan inference, instruction understanding, incremental interpretation, language acquisition, syntactic parsing, causal reasoning, free word order languages, ... and many other areas. With this in mind, rather than trying to summarize the varied work currently underway here at Penn, we suggest reading the following abstracts to see how the students and faculty themselves describe their work. Their abstracts illustrate the diversity of interests among the researchers, explain the areas of common interest, and describe some very interesting work in Cognitive Science. This report is a collection of abstracts from both faculty and graduate students in Computer Science, Psychology and Linguistics. We pride ourselves on the close working relations between these groups, as we believe that the communication among the different departments and the ongoing inter-departmental research not only improves the quality of our work, but makes much of that work possible.

    COMMITMENT AND FLEXIBILITY IN THE DEVELOPING PARSER

    This dissertation investigates adults' and children's sentence processing mechanisms, with a special focus on how multiple levels of linguistic representation are incrementally computed in real time, and how this process affects the parser's ability to later revise its early commitments. Using cross-methodological and cross-linguistic investigations of long-distance dependency processing, this dissertation demonstrates how paying explicit attention to the procedures by which linguistic representations are computed is vital to understanding both adults' real-time linguistic computation and children's reanalysis mechanisms. The first part of the dissertation uses time-course evidence from self-paced reading and eye-tracking studies (reading and visual world) to show that long-distance dependency processing can be decomposed into a sequence of syntactic and interpretive processes. First, the reading experiments provide evidence that suggests that filler-gap dependencies are constructed before verb information is accessed. Second, visual world experiments show that, in the absence of information that would allow hearers to predict verb content in advance, interpretive processes in filler-gap dependency computation take around 600 ms. These results argue for a predictive model of sentence interpretation in which syntactic representations are computed in advance of interpretive processes. The second part of the dissertation capitalizes on this procedural account of filler-gap dependency processing, and reports cross-linguistic studies on children's long-distance dependency processing. Interpretation data from English and Japanese demonstrate that children actively associate a fronted wh-phrase with the first VP in the sentence, and successfully retract such active syntactic commitments when the lack of a felicitous interpretation is signaled by verb information, but not when it is signaled by syntactic information. A comparison of the process of anaphor reconstruction in adults and children further suggests that verb-based thematic information is an effective revision cue for children. Finally, distributional analyses of wh-dependencies in child-directed speech are conducted to investigate how parsing constraints impact language acquisition. It is shown that the actual properties of the child parser can skew the input distribution, such that the effective distribution differs drastically from the input distribution seen from a researcher's perspective. This suggests that properties of developing perceptual mechanisms deserve more attention in language acquisition research.

    Research in the Language, Information and Computation Laboratory of the University of Pennsylvania

    This report takes its name from the Computational Linguistics Feedback Forum (CLiFF), an informal discussion group for students and faculty. However, the scope of the research covered in this report is broader than the title might suggest; this is the yearly report of the LINC Lab, the Language, Information and Computation Laboratory of the University of Pennsylvania. It may at first be hard to see the threads that bind together the work presented here, work by faculty, graduate students and postdocs in the Computer Science and Linguistics Departments, and the Institute for Research in Cognitive Science. It includes prototypical Natural Language fields such as: Combinatorial Categorial Grammars, Tree Adjoining Grammars, syntactic parsing and the syntax-semantics interface; but it extends to statistical methods, plan inference, instruction understanding, intonation, causal reasoning, free word order languages, geometric reasoning, medical informatics, connectionism, and language acquisition. Naturally, this introduction cannot spell out all the connections between these abstracts; we invite you to explore them on your own. In fact, with this issue it’s easier than ever to do so: this document is accessible on the “information superhighway”. Just call up http://www.cis.upenn.edu/~cliff-group/94/cliffnotes.html. In addition, you can find many of the papers referenced in the CLiFF Notes on the net. Most can be obtained by following links from the authors’ abstracts in the web version of this report. The abstracts describe the researchers’ many areas of investigation, explain their shared concerns, and present some interesting work in Cognitive Science. We hope its new online format makes the CLiFF Notes a more useful and interesting guide to Computational Linguistics activity at Penn.

    Classification-based phrase structure grammar: an extended revised version of HPSG

    This thesis is concerned with a presentation of Classification-based Phrase Structure Grammar (or cPSG), a grammatical theory that has grown out of extensive revisions of, and extensions to, HPSG. The fundamental difference between this theory and HPSG concerns the central role that classification plays in the grammar: the grammar classifies strings, according to their feature structure descriptions, as being of various types. Apart from the role of classification, the theory bears a close resemblance to HPSG, though it is by no means a direct translation, including numerous revisions and extensions. A central goal in the development of the theory has been its computational implementation, which is included in the thesis.
    The presentation may be divided into four parts. In the first, Chapters 1 and 2, we present the grammatical formalism within which the theory is stated. This consists of a development of the notion of a classificatory system (Chapter 1), and the incorporation of hierarchality into that notion (Chapter 2).
    The second part concerns syntactic issues. Chapter 3 revises the HPSG treatment of specifiers, complements and adjuncts, incorporating ideas that specifiers and complements should be distinguished and presenting a treatment of adjuncts whereby the head is selected for by the adjunct. Chapter 4 presents several options for an account of unbounded dependencies. The accounts are based loosely on that of GPSG, and a reconstruction of GPSG's Foot Feature Principle is presented which does not involve a notion of default. Chapter 5 discusses coordination, employing an extension of Rounds-Kasper logic to allow a treatment of cross-categorial coordination.
    In the third part, Chapters 6, 7 and 8, we turn to semantic issues. We begin (Chapter 6) with a discussion of Situation Theory, the background semantic theory, attempting to establish a precise and coherent version of the theory within which to work. Chapter 7 presents the bulk of the treatment of semantics, and can be seen as an extensive revision of the HPSG treatment of semantics. The aim is to provide a semantic treatment which is faithful to the version of Situation Theory presented in Chapter 6. Chapter 8 deals with quantification, discussing the nature of quantification in Situation Theory before presenting a treatment of quantification in cPSG. Some residual questions about the semantics of coordinated noun phrases are also addressed in this chapter.
    The final part, Chapter 9, concerns the actual computational implementation of the theory. A parsing algorithm based on hierarchical classification is presented, along with four strategies that might be adopted given that algorithm. Also discussed are some implementation details. A concluding chapter summarises the arguments of the thesis and outlines some avenues for future research.
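
    As a rough illustration of what classification-based processing involves, the toy Python fragment below classifies a feature description against a small type hierarchy by finding the most specific types whose inherited constraints it satisfies. The hierarchy, feature names, and monotonic-inheritance assumption are invented for the example; this is not the parsing algorithm or implementation presented in the thesis.

```python
# Toy illustration (not the cPSG implementation): each type adds constraints to
# those of its parents, and classifying a feature description means finding the
# most specific types whose accumulated constraints the description satisfies.

HIERARCHY = {                       # type -> (parents, local constraints); invented
    'sign':   ((),        {}),
    'phrase': (('sign',), {'lexical': '-'}),
    'word':   (('sign',), {'lexical': '+'}),
    'verb':   (('word',), {'head': 'v'}),
}

def constraints(t):
    """Constraints of `t` together with everything inherited from its parents."""
    parents, local = HIERARCHY[t]
    merged = {}
    for p in parents:
        merged.update(constraints(p))
    merged.update(local)
    return merged

def classify(description):
    """Most specific types in the hierarchy consistent with `description`."""
    satisfied = {t for t in HIERARCHY
                 if all(description.get(f) == v for f, v in constraints(t).items())}
    def direct_subtypes(t):
        return {s for s, (parents, _) in HIERARCHY.items() if t in parents}
    # Keep a type only if none of its satisfied subtypes is more specific.
    return {t for t in satisfied if not (direct_subtypes(t) & satisfied)}

print(classify({'lexical': '+', 'head': 'v'}))   # -> {'verb'}
```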