
    Merging two Hierarchies of Internal Contextual Grammars with Subregular Selection

    In this paper, we continue the research on the power of contextual grammars with selection languages from subfamilies of the family of regular languages. In the past, two independent hierarchies were obtained for external and internal contextual grammars: one based on selection languages defined by structural properties (finite, monoidal, nilpotent, combinational, definite, ordered, non-counting, power-separating, suffix-closed, commutative, circular, or union-free languages), the other based on selection languages defined by resources (the number of non-terminal symbols, production rules, or states needed to generate or accept them). In a previous paper, the language families of these hierarchies for external contextual grammars were compared and the hierarchies merged. In the present paper, we compare the language families of these hierarchies for internal contextual grammars and merge these hierarchies.
    Comment: In Proceedings NCMA 2023, arXiv:2309.07333.
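As a rough illustration of the internal derivation mode these grammars use (a toy sketch, not code from the paper), the step below adjoins a context (u, v) around any factor s of the current word that belongs to a selection language, approximated here by a regular expression:

```python
import re

def internal_derivations(word, contexts):
    """One derivation step of an internal contextual grammar:
    for every factorization w = x.s.y where the inner factor s
    belongs to a context's selection language, the context (u, v)
    is adjoined around s, yielding x.u.s.v.y.

    `contexts` is a list of (selector_regex, u, v) triples -- a toy
    stand-in for the (sub)regular selection languages in the paper.
    """
    results = set()
    for selector, u, v in contexts:
        pattern = re.compile(selector)
        for i in range(len(word) + 1):
            for j in range(i, len(word) + 1):
                s = word[i:j]
                if pattern.fullmatch(s):
                    results.add(word[:i] + u + s + v + word[j:])
    return results

# Toy grammar: from axiom "ab", the selector "ab" with context
# ("a", "b") yields "aabb" -- the start of {a^n b^n : n >= 1}.
step1 = internal_derivations("ab", [("ab", "a", "b")])
```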

    Acta Cybernetica : Volume 22. Number 2.


    Head-Driven Phrase Structure Grammar

    Get PDF
    Head-Driven Phrase Structure Grammar (HPSG) is a constraint-based or declarative approach to linguistic knowledge, which analyses all descriptive levels (phonology, morphology, syntax, semantics, pragmatics) with feature value pairs, structure sharing, and relational constraints. In syntax it assumes that expressions have a single relatively simple constituent structure. This volume provides a state-of-the-art introduction to the framework. Various chapters discuss basic assumptions and formal foundations, describe the evolution of the framework, and go into the details of the main syntactic phenomena. Further chapters are devoted to non-syntactic levels of description. The book also considers related fields and research areas (gesture, sign languages, computational linguistics) and includes chapters comparing HPSG with other frameworks (Lexical Functional Grammar, Categorial Grammar, Construction Grammar, Dependency Grammar, and Minimalism)
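The "feature value pairs" and constraint satisfaction the abstract mentions are typically modeled by unification of feature structures. A minimal sketch (feature structures as nested dicts, no structure sharing or typing; all feature names are illustrative, not HPSG's actual inventory):

```python
def unify(fs1, fs2):
    """Unify two feature structures represented as nested dicts.
    Returns the merged structure, or None on a feature clash.
    """
    if isinstance(fs1, dict) and isinstance(fs2, dict):
        out = dict(fs1)
        for key, val in fs2.items():
            if key in out:
                merged = unify(out[key], val)
                if merged is None:
                    return None  # incompatible values for this feature
                out[key] = merged
            else:
                out[key] = val
        return out
    # Atomic values unify only if they are identical.
    return fs1 if fs1 == fs2 else None

# Compatible structures merge their information:
noun = {"HEAD": {"POS": "noun", "AGR": {"NUM": "sg"}}}
det = {"HEAD": {"AGR": {"NUM": "sg", "PER": "3"}}}
merged = unify(noun, det)
```

Agreement phenomena fall out of this mechanism: a singular determiner unifies with a singular noun, while conflicting values (e.g. "sg" vs "pl") make unification fail.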

    Recursion in cognition: a computational investigation into the representation and processing of language

    Recursion qua self-reference applies to various constructs within the cognitive sciences, such as theoretical definitions, mechanical procedures (or algorithms), (abstract or real-time) computational processes, and structures. Recursion is an intrinsic property of both the mechanical procedure underlying the language faculty and the structures this faculty generates. However, the recursive nature of the generated structures and the recursive character of the processes need to be kept distinct; their study merits individual treatment. In fact, both the syntactic derivations of the grammar (an abstract computational process) and the processing strategies of the parser (a real-time process) are iterative, which suggests that recursively defined algorithms are implemented iteratively in linguistic cognition. Furthermore, the combination of recursion, lexical items, and the impositions of the interfaces with which the language faculty interacts results in a sui generis set of structures to which other domains of the mind bear only the most superficial relation.
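The abstract's claim that recursively defined algorithms can be implemented iteratively has a standard concrete form: the same function specified recursively and computed with an explicit stack. A toy sketch (the nested-list tree encoding and function names are illustrative, not from the thesis):

```python
def depth_recursive(tree):
    """Recursive specification: depth of a nested-list tree,
    where leaves are non-list values."""
    if not isinstance(tree, list):
        return 0
    return 1 + max(depth_recursive(child) for child in tree)

def depth_iterative(tree):
    """The same function computed iteratively with an explicit
    stack of (node, depth) pairs -- a recursive specification
    realized by an iterative process."""
    best, stack = 0, [(tree, 0)]
    while stack:
        node, depth = stack.pop()
        if isinstance(node, list):
            for child in node:
                stack.append((child, depth + 1))
        else:
            best = max(best, depth)
    return best

# Both implementations agree on any tree, e.g. a right-branching one:
tree = ["a", ["b", ["c"]]]
```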

    Cross-Linguistic Differences in the Learning of Inflectional Morphology: Effects of Target Language Paradigm Complexity

    Inflectional morphology poses significant difficulty to learners of foreign languages. Multiple approaches have attempted to explain it through one of two lenses. First, inflection has been viewed as one manifestation of syntactic knowledge; its learning has been related to the learning of syntactic structures. Second, the perceptual and semantic properties of the morphemes themselves have been invoked as a cause of difficulty. These groups of accounts presuppose different amounts of abstract knowledge and quite different learning mechanisms. On syntactic accounts, learners possess elaborate architectures of syntactic projections that they use to analyze linguistic input. They do not simply learn morphemes as discrete units in a list—instead, they learn the configurations of feature settings that these morphemes express. On general-cognitive accounts, learners do learn morphemes as units—each with non-zero difficulty and more or less independent of the others. The “more” there is to learn, the worse off the learner. This dissertation paves the way towards integrating the two types of accounts by testing them on cross-linguistic data. This study compares learning rates for languages whose inflectional systems vary in complexity (as reflected in the number of distinct inflectional endings)—German (lowest), Italian (high), and Czech (high, coupled with morpholexical variation). Written learner productions were examined for the accuracy of verbal inflection on dimensions ranging from morphosyntactic (uninflected forms, non-finite forms, use of finite instead of non-finite forms) to morpholexical (errors in root processes, application of wrong verb class templates, or wrong phonemic composition of the root or ending). Error frequencies were modeled using Poisson regression. Complexity affected accuracy differently in different domains of inflection production. 
Inflectional paradigm complexity was facilitative for learning to supply inflection, and learners of Italian and Czech were not disadvantaged compared to learners of German, despite their paradigms having more distinct elements. However, the complexity of verb class systems and the opacity of morphophonological alternations did result in disadvantages. Learners of Czech misapplied inflectional patterns associated with verb classes more than learners of German; they also failed to recall the correct segments associated with inflections, which resulted in more frequent use of nonexistent forms.
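The abstract reports modeling error frequencies with Poisson regression; the core idea can be sketched with a single-rate Poisson model per learner group (the counts below are invented for illustration, not the study's data):

```python
import math

def poisson_loglik(counts, rate):
    """Log-likelihood of i.i.d. Poisson(rate) for observed error counts:
    sum over c of [c*log(rate) - rate - log(c!)]."""
    return sum(c * math.log(rate) - rate - math.lgamma(c + 1)
               for c in counts)

# Hypothetical per-essay error counts for two learner groups:
german_errors = [1, 0, 2, 1]
czech_errors = [3, 2, 4, 3]

# The Poisson MLE for the rate is simply the sample mean; Poisson
# regression generalizes this by letting log(rate) depend on
# predictors such as target language and error domain.
rate_german = sum(german_errors) / len(german_errors)
rate_czech = sum(czech_errors) / len(czech_errors)
```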