Strictly Locally Testable and Resources Restricted Control Languages in Tree-Controlled Grammars
Tree-controlled grammars are context-free grammars where the derivation
process is controlled in such a way that every word on a level of the
derivation tree must belong to a certain control language. We investigate the
generative capacity of such tree-controlled grammars where the control
languages are special regular sets, especially strictly locally testable
languages or languages restricted by resources of the generation (number of
non-terminal symbols or production rules) or acceptance (number of states).
Furthermore, the set-theoretic inclusion relations among these subregular language
families themselves are studied.
Comment: In Proceedings AFL 2023, arXiv:2309.0112
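The control mechanism described in the abstract can be sketched concretely. The grammar and control language below are illustrative choices, not taken from the paper: a context-free grammar for a^n b^n whose derivation-tree levels are each required to belong to a regular control language.

```python
import re

# Illustrative tree-controlled grammar (not the paper's examples):
# CFG S -> aSb | ab, with the regular control language a* S? b*.
# A derivation is accepted only if the word formed by every level of
# the derivation tree belongs to the control language.
RULES = {"S": ["aSb", "ab"]}
CONTROL = re.compile(r"a*S?b*")

def level_words(k):
    """Level words of the derivation tree of a^k b^k (k >= 1),
    each checked against the control language."""
    level, out = ["S"], []
    remaining = k - 1                 # applications of S -> aSb still needed
    while level:
        word = "".join(level)
        if not CONTROL.fullmatch(word):
            raise ValueError(f"level {word!r} violates the control language")
        out.append(word)
        nxt = []
        for sym in level:
            if sym in RULES:          # expand the nonterminal
                nxt.extend(RULES[sym][0] if remaining > 0 else RULES[sym][1])
            # terminal symbols are leaves: they add nothing to deeper levels
        remaining -= 1
        level = nxt
    return out

print(level_words(3))   # ['S', 'aSb', 'aSb', 'ab']
```

Note that the middle levels repeat `aSb` rather than growing: a tree level contains only the children of the previous level's nonterminals, which is exactly what the control language constrains.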
Generative grammar
Generative Grammar is the label of the most influential research program in linguistics and related fields in the second half of the 20th century. Initiated by a short book, Noam Chomsky's Syntactic Structures (1957), it became one of the driving forces among the disciplines jointly called the cognitive sciences. The term generative grammar refers to an explicit, formal characterization of the (largely implicit) knowledge determining the formal aspect of all kinds of language behavior. The program had a strong mentalist orientation right from the beginning, documented, for example, in a fundamental critique of Skinner's Verbal Behavior (1957) by Chomsky (1959), arguing that behaviorist stimulus-response theories could in no way account for the complexities of ordinary language use. The "Generative Enterprise", as the program was called in 1982, went through a number of stages, each of which was accompanied by discussions of specific problems and consequences within the narrower domain of linguistics as well as the wider range of related fields, such as ontogenetic development, psychology of language use, or biological evolution. Four stages of the Generative Enterprise can be marked off for expository purposes.
Linguistics and some aspects of its underlying dynamics
In recent years, central components of a new approach to linguistics, the
Minimalist Program (MP), have come closer to physics. Features of the Minimalist
Program, such as the unconstrained nature of recursive Merge, the operation of
the Labeling Algorithm that only operates at the interface of Narrow Syntax
with the Conceptual-Intentional and the Sensory-Motor interfaces, the
difference between pronounced and unpronounced copies of elements in a
sentence and the build-up of the Fibonacci sequence in the syntactic derivation
of sentence structures, are directly accessible to representation in terms of
algebraic formalism. Although in our scheme linguistic structures are classical
ones, we find that an interesting and productive isomorphism can be established
between the MP structure, algebraic structures and many-body field theory,
opening new avenues of inquiry into the dynamics underlying some central
aspects of linguistics.
Comment: 17 pages
Implicit learning of recursive context-free grammars
Context-free grammars are fundamental for the description of linguistic syntax. However, most artificial grammar learning
experiments have explored learning of simpler finite-state grammars, while studies exploring context-free grammars have
not assessed awareness and implicitness. This paper explores the implicit learning of context-free grammars employing
features of hierarchical organization, recursive embedding and long-distance dependencies. The grammars also featured
the distinction between left- and right-branching structures, as well as between centre- and tail-embedding, both
distinctions found in natural languages. People acquired unconscious knowledge of relations between grammatical classes
even for dependencies over long distances, in ways that went beyond learning simpler relations (e.g. n-grams) between
individual words. The structural distinctions drawn from linguistics also proved important as performance was greater for
tail-embedding than centre-embedding structures. The results suggest the plausibility of implicit learning of complex
context-free structures, which model some features of natural languages. They support the relevance of artificial grammar
learning for probing mechanisms of language learning and challenge existing theories and computational models of
implicit learning.
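The structural contrast at the heart of these experiments can be sketched with toy grammars. The class names and pairing below are illustrative, not the authors' stimuli: centre-embedding nests long-distance dependencies in mirror order, while tail-embedding closes each dependency before the next opens.

```python
# Illustrative grammatical class pairs (not the paper's stimuli):
# each A-class word must be matched by its paired B-class word.
PAIRS = {"A1": "B1", "A2": "B2", "A3": "B3"}

def centre_embedded(classes):
    """A1 A2 A3 B3 B2 B1: dependencies nest, so the first opened
    dependency is the last to close (a long-distance relation)."""
    return classes + [PAIRS[c] for c in reversed(classes)]

def tail_embedded(classes):
    """A1 B1 A2 B2 A3 B3: each dependency closes immediately,
    so no relation spans intervening material."""
    out = []
    for c in classes:
        out += [c, PAIRS[c]]
    return out

print(centre_embedded(["A1", "A2", "A3"]))
# ['A1', 'A2', 'A3', 'B3', 'B2', 'B1']
print(tail_embedded(["A1", "A2", "A3"]))
# ['A1', 'B1', 'A2', 'B2', 'A3', 'B3']
```

Both grammars generate the same class inventory and string lengths; only the dependency structure differs, which is what makes the observed performance gap between the two conditions informative.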
Interaction Grammars
Interaction Grammar (IG) is a grammatical formalism based on the notion of
polarity. Polarities express the resource sensitivity of natural languages by
modelling the distinction between saturated and unsaturated syntactic
structures. Syntactic composition is represented as a chemical reaction guided
by the saturation of polarities. It is expressed in a model-theoretic framework
where grammars are constraint systems using the notion of tree description and
parsing appears as a process of building tree description models satisfying
criteria of saturation and minimality.
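The saturation criterion can be illustrated with a toy model (the representation below is a deliberate simplification, not the IG formalism itself): each lexical description contributes polarized features, and composition succeeds only when every negative (expected) polarity is cancelled by a matching positive (provided) one.

```python
from collections import Counter

def saturated(descriptions):
    """Toy polarity check (a simplification of IG): each description is a
    list of (feature, polarity) pairs with polarity +1 (resource provided)
    or -1 (resource expected). The composition is saturated when every
    feature's polarities cancel to zero."""
    balance = Counter()
    for desc in descriptions:
        for feature, pol in desc:
            balance[feature] += pol
    return all(v == 0 for v in balance.values())

# Hypothetical lexical descriptions for "Mary sees John": the verb expects
# two noun phrases (negative polarities); each noun phrase provides one.
verb = [("subj_np", -1), ("obj_np", -1)]
subj = [("subj_np", +1)]
obj  = [("obj_np", +1)]

print(saturated([verb, subj, obj]))   # True: all polarities cancel
print(saturated([verb, subj]))        # False: obj_np remains unsaturated
```

The chemical-reaction metaphor in the abstract corresponds to this cancellation: composition proceeds by pairing opposite polarities until the structure is electrically neutral, i.e. saturated.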
Generation of folk song melodies using Bayes transforms
The paper introduces the "Bayes transform", a mathematical procedure for putting data into a hierarchical representation. Applicable to any type of data, the procedure yields interesting results when applied to sequences. In this case, the representation obtained implicitly models the repetition hierarchy of the source. There are then natural applications to music. Derivation of Bayes transforms can be the means of determining the repetition hierarchy of note sequences (melodies) in an empirical and domain-general way. The paper investigates application of this approach to folk song, examining the results that can be obtained by treating such transforms as generative models.