
    Geometric representations for minimalist grammars

    We reformulate minimalist grammars as partial functions on term algebras for strings and trees. Using filler/role bindings and tensor product representations, we construct homomorphisms for these data structures into geometric vector spaces. We prove that the structure-building functions as well as simple processors for minimalist languages can be realized by piecewise linear operators in representation space. We also propose harmony, i.e. the distance of an intermediate processing step from the final well-formed state in representation space, as a measure of processing complexity. Finally, we illustrate our findings by means of two particular arithmetic and fractal representations. Comment: 43 pages, 4 figures
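
    The core device mentioned in the abstract, binding fillers to roles with tensor products and superposing the bindings, can be illustrated with a short sketch. This is not the paper's term-algebra construction; the vocabulary, the one-hot vectors, and the three-position string are illustrative assumptions.

    ```python
    # Minimal filler/role tensor product representation sketch (illustrative,
    # not the paper's exact homomorphism).
    import numpy as np

    fillers = {"the": 0, "cat": 1, "sleeps": 2}   # hypothetical vocabulary
    n_fillers, n_roles = len(fillers), 3          # three string positions

    F = np.eye(n_fillers)                         # one-hot filler vectors
    R = np.eye(n_roles)                           # orthonormal role (position) vectors

    def encode(tokens):
        """Bind each token to its position via an outer product and superpose."""
        psi = np.zeros((n_fillers, n_roles))
        for pos, tok in enumerate(tokens):
            psi += np.outer(F[fillers[tok]], R[pos])
        return psi

    def unbind(psi, pos):
        """Recover the filler at a position; exact because the roles are orthonormal."""
        return psi @ R[pos]

    psi = encode(["the", "cat", "sleeps"])
    print(unbind(psi, 1))                         # one-hot vector for "cat": [0. 1. 0.]
    ```
    With orthonormal role vectors, unbinding is exact; the paper's point is that structure-building operations over such representations can be realized as piecewise linear maps on the resulting vector space.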

    Natural language software registry (second edition)


    Ontology verbalization in agglutinating Bantu languages: a study of Runyankore and its generalizability

    Natural Language Generation (NLG) systems have been developed to generate text in multiple domains, including personalized patient information. However, their application is limited in Africa because they generate text in English, yet indigenous languages are still predominantly spoken throughout the continent, especially in rural areas. The existing healthcare NLG systems cannot be reused for Bantu languages due to their complex grammatical structure, nor can the generated text be used in machine translation systems for Bantu languages because they are computationally under-resourced. This research aimed to verbalize ontologies in agglutinating Bantu languages. We had four research objectives: (1) noun pluralization and verb conjugation in Runyankore; (2) Runyankore verbalization patterns for the selected description logic constructors; (3) combining the pluralization, conjugation, and verbalization components to form a Runyankore grammar engine; and (4) generalizing the Runyankore and isiZulu approaches to ontology verbalization to other agglutinating Bantu languages.

    We used an approach that combines morphology with syntax and semantics to develop a noun pluralizer for Runyankore, and used Context-Free Grammars (CFGs) for verb conjugation. We developed verbalization algorithms for eight constructors in a description logic, and then combined these components into a grammar engine developed as a Protégé 5.X plugin. The investigation into generalizability used the bootstrap approach, examining bootstrapping both for languages in the same language zone (intra-zone bootstrappability) and for languages across language zones (inter-zone bootstrappability). We obtained verbalization patterns for Luganda and isiXhosa, in the same zones as Runyankore and isiZulu respectively, and for chiShona, Kikuyu, and Kinyarwanda from different zones, and used the bootstrap metric that we developed to identify the most efficient source-target bootstrap pair. By regrouping Meinhof’s noun class system we were able to eliminate non-determinism during computation, which led to the development of a generic noun pluralizer. We also showed that CFGs can conjugate verbs in the five additional languages. Finally, we proposed the architecture for an API that could be used to generate text in agglutinating Bantu languages.

    Our research provides a method for surface realization for Bantu languages, an under-resourced and grammatically complex family of languages. We leave the development of a complete NLG system based on the Runyankore grammar engine, and of the API, as areas for future work
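
    As a rough illustration of the pattern-based verbalization idea described above, the sketch below maps a couple of description logic constructors to surface templates. The templates are English stand-ins and the axiom encoding is hypothetical; the actual Runyankore and isiZulu patterns additionally drive noun-class agreement and agglutinative morphology, which this toy version does not attempt to model.

    ```python
    # Toy pattern-based verbalizer: one surface template per DL constructor.
    # Axioms are given as nested tuples (a hypothetical encoding); English
    # templates stand in for the noun-class-aware Runyankore patterns.

    def verbalize(axiom):
        kind = axiom[0]
        if kind == "SubClassOf":                      # C is subsumed by D
            return f"every {axiom[1]} is {verbalize_concept(axiom[2])}"
        if kind == "DisjointClasses":                 # C and D share no instances
            return f"no {axiom[1]} is a {axiom[2]}"
        raise ValueError(f"no verbalization pattern for constructor {kind}")

    def verbalize_concept(concept):
        if isinstance(concept, tuple) and concept[0] == "SomeValuesFrom":   # existential restriction
            return f"something that {concept[1]}s at least one {concept[2]}"
        return f"a {concept}"

    print(verbalize(("SubClassOf", "lion", ("SomeValuesFrom", "eat", "animal"))))
    # -> every lion is something that eats at least one animal
    print(verbalize(("DisjointClasses", "giraffe", "lion")))
    # -> no giraffe is a lion
    ```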

    An analysis of the application of AI to the development of intelligent aids for flight crew tasks

    This report presents the results of a study aimed at developing a basis for applying artificial intelligence to the flight deck environment of commercial transport aircraft. In particular, the study comprised four tasks: (1) analysis of flight crew tasks, (2) a survey of the state of the art in relevant artificial intelligence areas, (3) identification of human factors issues relevant to intelligent cockpit aids, and (4) identification of artificial intelligence areas requiring further research

    A workshop on the gathering of information for problem formulation

    Issued as Quarterly progress reports no. [1-5], Proceedings and Final contract report, Project no. G-36-651. Papers presented at the Workshop/Symposium on Human Computer Interaction, March 26 and 27, 1981, Atlanta, GA

    Head-Driven Phrase Structure Grammar

    Head-Driven Phrase Structure Grammar (HPSG) is a constraint-based or declarative approach to linguistic knowledge, which analyses all descriptive levels (phonology, morphology, syntax, semantics, pragmatics) with feature-value pairs, structure sharing, and relational constraints. In syntax it assumes that expressions have a single, relatively simple constituent structure. This volume provides a state-of-the-art introduction to the framework. Various chapters discuss basic assumptions and formal foundations, describe the evolution of the framework, and go into the details of the main syntactic phenomena. Further chapters are devoted to non-syntactic levels of description. The book also considers related fields and research areas (gesture, sign languages, computational linguistics) and includes chapters comparing HPSG with other frameworks (Lexical Functional Grammar, Categorial Grammar, Construction Grammar, Dependency Grammar, and Minimalism)
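
    The two devices named in the abstract, feature-value pairs and structure sharing, can be made concrete with a small sketch. This is only an informal illustration using nested Python dictionaries, not the typed feature logic HPSG is actually formulated in; the attribute names are invented for the example.

    ```python
    # Toy feature structure: two paths share (are token-identical to) the same
    # agreement value, so a constraint stated once is visible from both paths.

    agr = {"PERSON": "3rd", "NUMBER": None}        # underspecified agreement value

    sentence = {
        "SUBJ": {"HEAD": "noun", "AGR": agr},      # the subject's agreement ...
        "HEAD": {"POS": "verb", "AGR": agr},       # ... is shared with the verb's
    }

    agr["NUMBER"] = "sg"                           # resolve the shared value once

    print(sentence["SUBJ"]["AGR"] is sentence["HEAD"]["AGR"])   # True: structure sharing
    print(sentence["HEAD"]["AGR"]["NUMBER"])                    # 'sg' in both places
    ```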

    Implicit learning of natural language syntax

    The present dissertation focuses on the question of how humans acquire syntactic knowledge without intending to and without awareness of what they have learned. The aim is to apply the theoretical concepts and the methodological framework provided by implicit learning research to the investigation of language acquisition. The results of six experiments are reported. In terms of design, all experiments consisted of (i) a training phase, during which subjects were trained on a miniature linguistic system by means of different exposure conditions, (ii) an unexpected testing phase, during which learning and awareness were assessed, and (iii) a debriefing session. A semi-artificial grammar, which consisted of English words and German syntax, was employed to generate the stimulus material for experiments 1, 2, 3, 5 and 6; in the case of experiment 4, nonsense syllables were used instead of English words. The linguistic focus was on verb placement rules. Native speakers of English with no background in German (or any other V2 language) were recruited to take part in the experiments.

    Participants in experiments 1-5 were exposed to the semi-artificial system under incidental learning conditions by means of different training tasks. In experiments 1 and 2, an auditory plausibility judgment task was used to expose participants to the stimulus sentences. In experiment 3, elicited imitations were used in addition to the plausibility judgment task. The training phase in experiment 4 consisted solely of elicited imitations, while training in experiment 5 consisted of a classification task which required participants to identify the syntactic structure of each stimulus item, followed by plausibility judgments. Participants in experiment 6, on the other hand, were exposed to the semi-artificial grammar under intentional learning conditions. These participants were told that the word order of the stimulus sentences was governed by a complex rule system and were instructed to discover syntactic rules. After training, participants in all six experiments took part in a testing phase which assessed whether learning took place and to what extent they became aware of the knowledge they had acquired. Grammaticality judgments were used as a measure of learning. Awareness was assessed by means of verbal reports, accuracy estimates, confidence ratings, and source attributions. Control participants did not take part in the training phase.

    The results of the experiments indicate that adult learners are able to acquire syntactic structures of a novel language under both incidental and intentional learning conditions, while processing sentences for meaning, without the benefit of corrective feedback, and after short exposure periods. That is, the findings demonstrate that the implicit learning of natural language is not restricted to infants and child learners. In addition, the experiments also show that subjects are able to transfer their knowledge to stimuli with the same underlying structure but new surface features. The measures of awareness further suggest that, in experiments 3 to 6 at least, learning resulted in both conscious and unconscious knowledge. While subjects did not become aware of all the information they had acquired, it was clear that higher levels of awareness were associated with improved performance.

    The findings reported in this dissertation have several implications for our understanding of language acquisition and for future research. Firstly, while the precise form of the knowledge acquired in these experiments is unclear, the findings provided no evidence for rule learning in the vast majority of subjects. This suggests that subjects in these types of experiments (and perhaps in natural language acquisition) do not acquire linguistic rules. The results support Shanks (1995; Johnstone & Shanks, 2001), who argues against the possibility of implicit rule learning. Secondly, while adults can acquire knowledge implicitly, the work reported in this dissertation also demonstrates that adult syntactic learning results predominantly in a conscious (but largely unverbalizable) knowledge base. Finally, from a methodological perspective, the results of the experiments confirm that relying on verbal reports as a measure of awareness is not sufficient. The verbal reports collected at the end of the experiment were helpful in determining what aspects of the semi-artificial grammar subjects had consciously noticed. At the same time, verbal reports were clearly not sensitive enough to assess whether subjects were aware of the knowledge they had acquired. Confidence ratings and source attributions provided a very useful method for capturing low levels of awareness and observing the conscious status of both structural and judgment knowledge. Future experiments on language acquisition would benefit from the introduction of this relatively simple but effective way of assessing awareness