
    Introducing the Concept of Activation and Blocking of Rules in the General Framework for Regulated Rewriting in Sequential Grammars

    We introduce new possibilities to control the application of rules based on the preceding application of rules, which can be defined for a general model of sequential grammars, and we show some similarities to other control mechanisms such as graph-controlled grammars and matrix grammars with and without applicability checking, as well as grammars with random context conditions and ordered grammars. Using both activation and blocking of rules, in the string and in the multiset case we can show computational completeness of context-free grammars equipped with the control mechanism of activation and blocking of rules even when using only two nonterminal symbols.
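
    To make the control mechanism concrete, here is a minimal Python sketch, not the paper's formal definitions: each rule, when applied, determines which rules are activated or blocked at the next derivation step, and blocking overrides activation. The rules, the one-step timing, and the random rule choice are assumptions made only for the illustration.

```python
import random

# Illustrative sketch of "activation and blocking of rules" in a toy
# sequential string grammar. Applying a rule decides which rules are
# activated or blocked at the next step; blocking overrides activation.
RULES = {
    # name: (left-hand side, right-hand side, activates, blocks)
    "r1": ("S", "aSb", {"r1", "r2"}, set()),   # may continue or finish next step
    "r2": ("S", "ab",  set(),        {"r1"}),  # finishing blocks further growth
}

def derive(start="S", max_steps=10, seed=0):
    random.seed(seed)
    sentential = start
    active = {"r1"}                             # only r1 is activated initially
    for _ in range(max_steps):
        applicable = sorted(n for n in active if RULES[n][0] in sentential)
        if not applicable:
            break                               # no active rule applies: halt
        name = random.choice(applicable)
        lhs, rhs, activates, blocks = RULES[name]
        sentential = sentential.replace(lhs, rhs, 1)
        active = activates - blocks             # blocking overrides activation
        print(f"{name}: {sentential}")
    return sentential

derive()   # derives a string of the form a^n b^n
```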

    On the Size Complexity of Non-Returning Context-Free PC Grammar Systems

    Improving the previously known best bound, we show that any recursively enumerable language can be generated with a non-returning parallel communicating (PC) grammar system having six context-free components. We also present a non-returning universal PC grammar system generating unary languages, that is, a system where not only the number of components, but also the number of productions and the number of nonterminals are limited by certain constants, and these size parameters do not depend on the generated language.
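
    As a rough illustration of the underlying machinery, the sketch below simulates one rewriting step and one communication step of a non-returning PC grammar system with two context-free components; the components, rules, and query-symbol notation Q2 are invented for the example and are unrelated to the paper's six-component construction.

```python
# Toy sketch of one derivation cycle of a non-returning PC grammar system.
COMPONENTS = {
    1: {"form": "S1", "rules": {"S1": "a Q2 b"}},   # master component, queries component 2
    2: {"form": "S2", "rules": {"S2": "c S2"}},
}

def rewriting_step():
    # every component rewrites one nonterminal of its own sentential form in parallel
    for comp in COMPONENTS.values():
        for lhs, rhs in comp["rules"].items():
            if lhs in comp["form"]:
                comp["form"] = comp["form"].replace(lhs, rhs, 1)
                break

def communication_step():
    # a query symbol Qj is replaced by component j's current sentential form;
    # in the non-returning mode the queried component keeps its string
    for comp in COMPONENTS.values():
        for j in list(COMPONENTS):
            comp["form"] = comp["form"].replace(f"Q{j}", COMPONENTS[j]["form"])

rewriting_step()        # 1: "a Q2 b", 2: "c S2"
communication_step()    # 1: "a c S2 b"
print(COMPONENTS[1]["form"], "|", COMPONENTS[2]["form"])
```

    The point the example highlights is the non-returning behaviour: after being queried, component 2 keeps its sentential form instead of returning to its axiom.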

    Machine Learning and the Cognitive Basis of Natural Language

    Article

    Learning morphological phenomena of Modern Greek an exploratory approach

    This paper presents a computational model for the description of concatenative morphological phenomena of Modern Greek (such as inflection, derivation and compounding) to allow learners, trainers and developers to explore linguistic processes through their own constructions in an interactive open‐ended multimedia environment. The proposed model introduces a new language metaphor, the ‘puzzle‐metaphor’ (similar to the existing ‘turtle‐metaphor’ for concepts from mathematics and physics), based on a visualized unification‐like mechanism for pattern matching. The computational implementation of the model can be used for creating environments for learning through design and learning by teaching.
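
    The sketch below is a hypothetical, much-simplified version of such a unification-like mechanism for concatenative morphology: two “puzzle pieces” (a stem and an ending) are concatenated only if their feature structures unify. The lexical entries and feature names are invented for the illustration and are not taken from the paper.

```python
# Unification-style combination of morphological "puzzle pieces".
def unify(f1, f2):
    """Return the merged feature structure, or None if any feature clashes."""
    merged = dict(f1)
    for key, value in f2.items():
        if key in merged and merged[key] != value:
            return None            # clash: the two pieces do not fit
        merged[key] = value
    return merged

STEMS = [("γραφ", {"cat": "verb", "class": "A"})]            # 'write'
ENDINGS = [("ω",   {"cat": "verb", "class": "A", "pers": 1, "num": "sg"}),
           ("εις", {"cat": "verb", "class": "A", "pers": 2, "num": "sg"})]

for stem, stem_feats in STEMS:
    for ending, ending_feats in ENDINGS:
        features = unify(stem_feats, ending_feats)
        if features is not None:   # the pieces fit: concatenate them
            print(stem + ending, features)
```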

    The Unsupervised Acquisition of a Lexicon from Continuous Speech

    We present an unsupervised learning algorithm that acquires a natural-language lexicon from raw speech. The algorithm is based on the optimal encoding of symbol sequences in an MDL framework, and uses a hierarchical representation of language that overcomes many of the problems that have stymied previous grammar-induction procedures. The forward mapping from symbol sequences to the speech stream is modeled using features based on articulatory gestures. We present results on the acquisition of lexicons and language models from raw speech, text, and phonetic transcripts, and demonstrate that our algorithm compares very favorably to other reported results with respect to segmentation performance and statistical efficiency. Comment: 27-page technical report.
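
    The following toy sketch illustrates the MDL idea behind such a segmentation procedure, not the paper's algorithm: a segmentation is scored by the cost of pointing into a candidate lexicon plus the cost of storing that lexicon, and the cheapest parse is chosen. The lexicon, the bit costs, and the text-only setting are assumptions made for the example.

```python
import math
from functools import lru_cache

# MDL-flavoured segmentation toy: total description length =
# cost of the lexicon entries used + cost of encoding the text as lexicon pointers.
LEXICON = ["the", "dog", "cat", "thed", "og"]
CHAR_BITS = 5                                   # assumed cost per character stored in the lexicon
WORD_BITS = math.log2(len(LEXICON))             # cost of one pointer into the lexicon

def lexicon_cost(words):
    return sum(CHAR_BITS * len(w) for w in set(words))

@lru_cache(maxsize=None)
def best_parse(text):
    """Cheapest sequence of lexicon words covering `text`: (encoding cost, words)."""
    if not text:
        return 0.0, ()
    best = (float("inf"), ())
    for w in LEXICON:
        if text.startswith(w):
            cost, rest = best_parse(text[len(w):])
            best = min(best, (WORD_BITS + cost, (w,) + rest))
    return best

cost, words = best_parse("thedog")
print(words, "encoding bits:", round(cost, 2), "lexicon bits:", lexicon_cost(words))
```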

    Geometric representations for minimalist grammars

    We reformulate minimalist grammars as partial functions on term algebras for strings and trees. Using filler/role bindings and tensor product representations, we construct homomorphisms for these data structures into geometric vector spaces. We prove that the structure-building functions as well as simple processors for minimalist languages can be realized by piecewise linear operators in representation space. We also propose harmony, i.e. the distance of an intermediate processing step from the final well-formed state in representation space, as a measure of processing complexity. Finally, we illustrate our findings by means of two particular arithmetic and fractal representations. Comment: 43 pages, 4 figures.
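
    As a concrete illustration of filler/role binding with tensor products (the general technique, not the paper's specific construction), the sketch below binds two fillers to orthonormal role vectors by outer products, superimposes the bindings by addition, and recovers a filler by contracting with its role. The vectors and the two-role structure are assumptions made for the example.

```python
import numpy as np

# Tensor product representation (TPR): bind fillers to roles with outer
# products and superimpose the bindings by addition.
rng = np.random.default_rng(0)
fillers = {"the": rng.normal(size=4), "dog": rng.normal(size=4)}
roles = {"left": np.array([1.0, 0.0]), "right": np.array([0.0, 1.0])}  # orthonormal roles

# T = f_the (x) r_left + f_dog (x) r_right
T = np.outer(fillers["the"], roles["left"]) + np.outer(fillers["dog"], roles["right"])

# Unbinding: with orthonormal roles, contracting T with a role vector
# returns exactly the filler bound to that role.
recovered = T @ roles["right"]
print(np.allclose(recovered, fillers["dog"]))   # True
```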