Implicit learning of recursive context-free grammars
Context-free grammars are fundamental for the description of linguistic syntax. However, most artificial grammar learning
experiments have explored learning of simpler finite-state grammars, while studies exploring context-free grammars have
not assessed awareness and implicitness. This paper explores the implicit learning of context-free grammars employing
features of hierarchical organization, recursive embedding and long-distance dependencies. The grammars also featured
the distinction between left- and right-branching structures, as well as between centre- and tail-embedding, both
distinctions found in natural languages. People acquired unconscious knowledge of relations between grammatical classes
even for dependencies over long distances, in ways that went beyond learning simpler relations (e.g. n-grams) between
individual words. The structural distinctions drawn from linguistics also proved important, as performance was
greater for tail-embedding than for centre-embedding structures. The results suggest the plausibility of implicit learning of complex
context-free structures, which model some features of natural languages. They support the relevance of artificial grammar
learning for probing mechanisms of language learning and challenge existing theories and computational models of
implicit learning.
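The structural contrast the abstract draws can be made concrete with two toy grammars, one centre-embedding and one tail-embedding. The grammars and the generator below are illustrative assumptions, not the materials used in the study:

```python
import random

# Two hypothetical toy grammars illustrating the structural distinction in the
# abstract: centre-embedding nests dependencies (strings a^n b^n, where each
# 'a' pairs with a distant 'b'), while tail-embedding keeps them strictly
# local (strings (ab)^n).
CENTRE = {"S": [["a", "S", "b"], ["a", "b"]]}   # S -> a S b | a b
TAIL   = {"S": [["a", "b", "S"], ["a", "b"]]}   # S -> a b S | a b

def generate(grammar, symbol="S", depth=3):
    """Expand `symbol` with `grammar`, forcing termination below `depth`."""
    if symbol not in grammar:
        return [symbol]                          # terminal symbol
    # at depth 0, take the non-recursive alternative (listed last here)
    rule = grammar[symbol][-1] if depth <= 0 else random.choice(grammar[symbol])
    out = []
    for s in rule:
        out.extend(generate(grammar, s, depth - 1))
    return out

print("centre:", "".join(generate(CENTRE)))      # e.g. 'aabb', 'aaabbb', ...
print("tail:  ", "".join(generate(TAIL)))        # e.g. 'abab', 'ababab', ...
```

In the centre-embedded strings, recognizing that the first 'a' depends on the last 'b' requires tracking a long-distance dependency, which is exactly what an n-gram learner cannot do.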
Algebraic properties of structured context-free languages: old approaches and novel developments
The historical research line on the algebraic properties of structured CF
languages initiated by McNaughton's Parenthesis Languages has recently
attracted much renewed interest with the Balanced Languages, the Visibly
Pushdown Automata languages (VPDA), the Synchronized Languages, and the
Height-deterministic ones. Such families preserve to a varying degree the basic
algebraic properties of Regular languages: boolean closure, closure under
reversal, under concatenation, and Kleene star. We prove that the VPDA family
is strictly contained within the Floyd Grammars (FG) family historically known
as operator precedence. Languages over the same precedence matrix are known to
be closed under boolean operations, and are recognized by a machine whose pop
or push operations on the stack are purely determined by terminal letters. We
characterize VPDA's as the subclass of FG having a peculiarly structured set of
precedence relations, and balanced grammars as a further restricted case. The
non-counting invariance property of FG has a direct implication for VPDA too.
Comment: Extended version of paper presented at WORDS2009, Salerno, Italy, September 200
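The stack discipline mentioned in the abstract — push or pop operations determined purely by the terminal letter read — can be sketched as a minimal visibly-pushdown recognizer. The partition of the alphabet below is a hypothetical example:

```python
# A minimal visibly-pushdown recognizer over a hypothetical alphabet
# partitioned into call, return and internal letters: as in the abstract,
# whether the machine pushes or pops is decided by the terminal alone.
CALLS, RETURNS, INTERNALS = set("(["), set(")]"), set("x")
MATCH = {")": "(", "]": "["}

def accepts(word):
    """Accept well-nested words: calls push, returns pop a matching call."""
    stack = []
    for c in word:
        if c in CALLS:
            stack.append(c)
        elif c in RETURNS:
            if not stack or stack.pop() != MATCH[c]:
                return False
        elif c not in INTERNALS:
            return False          # letter outside the alphabet
    return not stack              # accept only if every call was matched

print(accepts("([x])"))   # True
print(accepts("([)]"))    # False: crossing dependencies are rejected
```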
Transformational classes of grammars
Given two Chomsky grammars G and Ḡ, a homomorphism φ from G to Ḡ is, roughly speaking, a map which assigns to every derivation of G a derivation of Ḡ in such a manner that φ is uniquely determined by its restriction to the set of productions of G. Two grammars are contained in the same transformational class if one can be transformed into the other by a sequence of homomorphisms. If two grammars are related in such a manner, then there are two relations, one concerning the words of the languages generated and the other concerning the derivations of these words. We establish several classifications of context-free grammars into transformational classes which are recursively solvable.
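A minimal sketch of the central definition, using hypothetical toy grammars: the map is given on the productions of G and extended compositionally to whole derivations, here represented as sequences of production names:

```python
# G:  S -> a S (p1) | a (p2)
G_PRODS  = {"p1": ("S", ["a", "S"]), "p2": ("S", ["a"])}
# G': S -> a A (q1), A -> S (q2), S -> a (q3)
GP_PRODS = {"q1": ("S", ["a", "A"]), "q2": ("A", ["S"]), "q3": ("S", ["a"])}

# The map on productions: each G-production is sent to a G'-derivation
# (a sequence of G'-productions). These names are illustrative assumptions.
PHI = {"p1": ["q1", "q2"], "p2": ["q3"]}

def phi(derivation):
    """Extend PHI from single productions to whole derivations of G."""
    image = []
    for p in derivation:
        image.extend(PHI[p])
    return image

# G derives aaa by p1 p1 p2; its image is a valid G'-derivation of aaa:
print(phi(["p1", "p1", "p2"]))   # ['q1', 'q2', 'q1', 'q2', 'q3']
```

Because `phi` is determined entirely by `PHI`, its restriction to single productions, it matches the abstract's requirement that the homomorphism be fixed by its values on productions.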
Higher-Order Operator Precedence Languages
Floyd's Operator Precedence (OP) languages are a deterministic context-free
family having many desirable properties. They are parsable locally and in
parallel, and languages having a compatible structure are closed under Boolean
operations, concatenation and star; they properly include the family of Visibly
Pushdown (or Input Driven) languages. OP languages are based on three relations
between any two consecutive terminal symbols, which assign syntax structure to
words. We extend such relations to k-tuples of consecutive terminal symbols, by
using the model of strictly locally testable regular languages of order k at
least 3. The new corresponding class of Higher-order Operator Precedence
languages (HOP) properly includes the OP languages, and it is still included in
the deterministic (also in reverse) context-free family. We prove Boolean
closure for each subfamily of structurally compatible HOP languages. In each
subfamily, the top language is called the max-language. We show that such
languages are defined by a simple cancellation rule, and we prove several of
their properties, in particular that the max-languages form an infinite
hierarchy ordered by the parameter k. HOP languages are a candidate for
replacing OP languages in the various applications where they have been
successful, though sometimes too restrictive.
Comment: In Proceedings AFL 2017, arXiv:1708.0622
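The three relations between consecutive terminals that these abstracts refer to can be sketched for a tiny arithmetic language. The precedence matrix below is the classical one for {n, +, *} with end marker '#'; the parser itself is an illustrative sketch, not the construction from the paper:

```python
LT, EQ, GT = "<", "=", ">"
# Floyd precedence matrix: the relation between two consecutive terminals
# alone decides whether to shift or to reduce, assigning structure to words.
PREC = {
    ("#", "n"): LT, ("#", "+"): LT, ("#", "*"): LT,
    ("n", "+"): GT, ("n", "*"): GT, ("n", "#"): GT,
    ("+", "n"): LT, ("+", "+"): GT, ("+", "*"): LT, ("+", "#"): GT,
    ("*", "n"): LT, ("*", "+"): GT, ("*", "*"): GT, ("*", "#"): GT,
}

def op_parse(tokens):
    """Operator-precedence parse of e.g. list("n+n*n") into a nested tuple."""
    # stack entries: ("t", symbol, relation_below_it) or ("N", subtree)
    stack, toks, i = [], list(tokens) + ["#"], 0

    def top_term():
        return next((e[1] for e in reversed(stack) if e[0] == "t"), "#")

    while True:
        a = toks[i]
        if top_term() == "#" == a:           # input reduced to one nonterminal
            return stack[-1][1]
        rel = PREC[(top_term(), a)]
        if rel in (LT, EQ):
            stack.append(("t", a, rel))      # shift
            i += 1
        else:                                # GT: reduce the topmost handle
            handle = []
            while not (stack[-1][0] == "t" and stack[-1][2] == LT):
                handle.append(stack.pop())
            handle.append(stack.pop())       # the <-marked terminal itself
            if stack and stack[-1][0] == "N":
                handle.append(stack.pop())   # adjacent nonterminal joins it
            handle.reverse()
            if len(handle) == 1:             # handle "n"
                stack.append(("N", "n"))
            else:                            # handle "N op N"
                left, op, right = handle
                stack.append(("N", (op[1], left[1], right[1])))

print(op_parse(list("n+n*n")))   # ('+', 'n', ('*', 'n', 'n'))
```

Note that '*' binds tighter than '+' purely because of the matrix entries ("+","*") = LT and ("*","+") = GT; no grammar rules are consulted at parse time.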
Evaluative conditioning of artificial grammars: evidence that subjectively-unconscious structures bias affective evaluations of novel stimuli
Evaluative conditioning (EC) refers to the acquisition of emotional valence by an initially-neutral stimulus (conditioned stimulus; CS) after being paired with an emotional stimulus (unconditioned stimulus; US). An important issue concerns whether, when participants are unaware of the CS-US contingency, the affective valence can generalize to new stimuli that share similarities with the CS. Previous studies have shown that generalization of EC effects appears only when participants are aware of the contingencies, but we suggest that this is because (a) the contingencies typically used in these studies are salient and easy to detect consciously, and (b) the performance-based measures of awareness (so-called “objective measures”) typically used in these studies tend to overestimate the amount of available conscious knowledge. We report a preregistered study in which participants (N = 217) were exposed to letter strings generated from two complex artificial grammars that are difficult to decipher consciously. Stimuli from one grammar were paired with positive USs, while those from the other grammar were paired with negative USs. Subsequently, participants evaluated new, previously-unseen stimuli from the positively-conditioned grammar more positively than new stimuli from the negatively-conditioned grammar. Importantly, this effect appeared even when trial-by-trial subjective measures indicated a lack of relevant conscious knowledge. We provide evidence for the generalization of EC effects even without subjective awareness of the structures that enable those generalizations.
Interaction Grammars
Interaction Grammar (IG) is a grammatical formalism based on the notion of
polarity. Polarities express the resource sensitivity of natural languages by
modelling the distinction between saturated and unsaturated syntactic
structures. Syntactic composition is represented as a chemical reaction guided
by the saturation of polarities. It is expressed in a model-theoretic framework
where grammars are constraint systems using the notion of tree description and
parsing appears as a process of building tree description models satisfying
criteria of saturation and minimality.
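The saturation mechanism the abstract describes — composition driven by the cancellation of opposite polarities — can be sketched with a toy lexicon. The feature names and lexical entries here are hypothetical illustrations, not IG's actual tree-description machinery:

```python
from collections import Counter

# Each word carries positive polarities (resources it provides) and negative
# polarities (resources it expects). Composition succeeds when every +f
# cancels a matching -f; a single unmatched +s remains as the result category.
LEXICON = {
    "Mary":   Counter({"+np": 1}),
    "sleeps": Counter({"-np": 1, "+s": 1}),
}

def compose(*entries):
    """Sum polarities; a +f and a -f of the same feature cancel out."""
    total = Counter()
    for e in entries:
        total += e
    for feat in {f.lstrip("+-") for f in total}:
        k = min(total["+" + feat], total["-" + feat])
        total["+" + feat] -= k
        total["-" + feat] -= k
    return +total                     # drop features reduced to zero

residue = compose(LEXICON["Mary"], LEXICON["sleeps"])
print(residue)                        # Counter({'+s': 1}): a saturated sentence
```

In the abstract's chemical-reaction metaphor, an empty residue apart from '+s' means the sentence is fully saturated; any leftover negative polarity signals an unmet expectation.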
Structure preserving transformations on non-left-recursive grammars
We will be concerned with grammar covers. The first part of this paper presents a general framework for covers. The second part introduces a transformation from non-left-recursive grammars to grammars in Greibach normal form. An investigation of the structure-preserving properties of this transformation, which also serves as an illustration of our framework for covers, is presented.
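The leading-symbol substitution at the heart of such a transformation can be sketched as follows. The toy grammar is a hypothetical example (uppercase strings are nonterminals, lowercase are terminals), and termination of the loop relies on the grammar being non-left-recursive:

```python
# Hypothetical non-left-recursive grammar:
#   S -> A B | a,   A -> a B | b,   B -> b
GRAMMAR = {
    "S": [["A", "B"], ["a"]],
    "A": [["a", "B"], ["b"]],
    "B": [["b"]],
}

def to_gnf(grammar):
    """Expand leading nonterminals until every production starts with a terminal."""
    gnf = {nt: [list(p) for p in prods] for nt, prods in grammar.items()}
    changed = True
    while changed:                   # terminates: no left recursion
        changed = False
        for nt, prods in gnf.items():
            new = []
            for p in prods:
                if p[0].isupper():   # leading nonterminal: substitute it
                    changed = True
                    new.extend(q + p[1:] for q in gnf[p[0]])
                else:
                    new.append(p)
            gnf[nt] = new
    return gnf

print(to_gnf(GRAMMAR))   # S now derives: a B B | b B | a
```

This shows only the substitution step; a full Greibach-normal-form construction additionally replaces any terminals left in non-leading positions by fresh nonterminals, which the toy grammar above happens not to need.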