6,069 research outputs found
Implicit learning of recursive context-free grammars
Context-free grammars are fundamental for the description of linguistic syntax. However, most artificial grammar learning
experiments have explored learning of simpler finite-state grammars, while studies exploring context-free grammars have
not assessed awareness and implicitness. This paper explores the implicit learning of context-free grammars employing
features of hierarchical organization, recursive embedding and long-distance dependencies. The grammars also featured
the distinction between left- and right-branching structures, as well as between centre- and tail-embedding, both
distinctions found in natural languages. People acquired unconscious knowledge of relations between grammatical classes
even for dependencies over long distances, in ways that went beyond learning simpler relations (e.g. n-grams) between
individual words. The structural distinctions drawn from linguistics also proved important, as performance was greater for
tail-embedding than for centre-embedding structures. The results suggest the plausibility of implicit learning of complex
context-free structures, which model some features of natural languages. They support the relevance of artificial grammar
learning for probing mechanisms of language learning and challenge existing theories and computational models of
implicit learning.
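A rough, hypothetical illustration of the centre- vs tail-embedding distinction (a toy Python sketch, not the stimuli or procedure of the study): each A_i must eventually be matched by its B_i, and only the centre-embedded grammar separates matching pairs over long distances.

    import random

    # Hypothetical toy grammars over abstract word classes A1/B1, A2/B2, ...
    # (not the actual stimulus materials of the study).
    # Centre-embedding nests dependencies:       A2 A1 B1 B2
    # Tail-embedding completes each one in turn: A2 B2 A1 B1

    def centre_embed(depth):
        # Wrap the embedded material between a matching A_i ... B_i pair.
        if depth == 0:
            return []
        return [f"A{depth}"] + centre_embed(depth - 1) + [f"B{depth}"]

    def tail_embed(depth):
        # Finish each A_i B_i pair before starting the next (right-branching).
        if depth == 0:
            return []
        return [f"A{depth}", f"B{depth}"] + tail_embed(depth - 1)

    depth = random.randint(1, 3)
    print("centre:", " ".join(centre_embed(depth)))  # e.g. A3 A2 A1 B1 B2 B3
    print("tail:  ", " ".join(tail_embed(depth)))    # e.g. A3 B3 A2 B2 A1 B1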
Robust Grammatical Analysis for Spoken Dialogue Systems
We argue that grammatical analysis is a viable alternative to concept
spotting for processing spoken input in a practical spoken dialogue system. We
discuss the structure of the grammar, and a model for robust parsing which
combines linguistic sources of information and statistical sources of
information. We discuss test results suggesting that grammatical processing
allows fast and accurate processing of spoken input.
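The abstract does not spell out how the linguistic and statistical sources of information are combined; purely as a hedged sketch of one common way to rank candidate analyses of noisy spoken input (a weighted combination, with illustrative field names and weights, not the cited system's actual model):

    from dataclasses import dataclass

    @dataclass
    class Candidate:
        # A candidate analysis of an utterance (hypothetical fields).
        fragments: int        # unconnected grammatical fragments in the analysis
        skipped_words: int    # input words the robust parser had to ignore
        log_prob: float       # log-probability from a statistical model

    def robustness_score(c, w_frag=1.0, w_skip=0.5, w_stat=1.0):
        # Fewer fragments/skips and a higher statistical score are better;
        # the weights here are illustrative only.
        return w_stat * c.log_prob - w_frag * c.fragments - w_skip * c.skipped_words

    candidates = [Candidate(1, 0, -4.2), Candidate(3, 2, -3.9)]
    print(max(candidates, key=robustness_score))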
CHR Grammars
A grammar formalism based upon CHR is proposed analogously to the way
Definite Clause Grammars are defined and implemented on top of Prolog. These
grammars execute as robust bottom-up parsers with an inherent treatment of
ambiguity and a high flexibility to model various linguistic phenomena. The
formalism extends previous logic programming based grammars with a form of
context-sensitive rules and the possibility to include extra-grammatical
hypotheses in both head and body of grammar rules. Among the applications are
straightforward implementations of Assumption Grammars and abduction under
integrity constraints for language analysis. CHR grammars appear as a powerful
tool for specification and implementation of language processors and may be
proposed as a new standard for bottom-up grammars in logic programming.
To appear in Theory and Practice of Logic Programming (TPLP), 2005.
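CHR Grammars are defined on top of Prolog/CHR, so any Python rendering is only a loose analogy; the sketch below captures just the bottom-up flavour (grammar symbols with spans accumulate in a store and binary rules combine adjacent spans), not CHR's context-sensitive rules or extra-grammatical hypotheses:

    # Loose sketch of bottom-up parsing over a "store" of spanned symbols.
    def bottom_up_parse(tokens, lexicon, rules):
        # The store holds items (category, start, end); start with lexical items.
        store = {(lexicon[w], i, i + 1) for i, w in enumerate(tokens)}
        changed = True
        while changed:
            changed = False
            for lhs, (c1, c2) in rules:               # binary rules: lhs -> c1 c2
                for (a, i, j) in list(store):
                    for (b, k, m) in list(store):
                        if a == c1 and b == c2 and j == k:
                            item = (lhs, i, m)
                            if item not in store:
                                store.add(item)
                                changed = True
        return store

    lexicon = {"the": "Det", "cat": "N", "sleeps": "V"}
    rules = [("NP", ("Det", "N")), ("S", ("NP", "V"))]
    items = bottom_up_parse("the cat sleeps".split(), lexicon, rules)
    print(("S", 0, 3) in items)   # True: the whole input parses as a sentence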
Pushdown Compression
The pressing need for efficient compression schemes for XML documents has
recently been focused on stack computation [6, 9], and in particular calls for
a formulation of information-lossless stack or pushdown compressors that allows
a formal analysis of their performance and a more ambitious use of the stack in
XML compression, where so far it is mainly connected to parsing mechanisms. In
this paper we introduce the model of pushdown compressor, based on pushdown
transducers that compute a single injective function while keeping the widest
generality regarding stack computation. The celebrated Lempel-Ziv algorithm
LZ78 [10] was introduced as a general purpose compression algorithm that
outperforms finite-state compressors on all sequences. We compare the
performance of the Lempel-Ziv algorithm with that of the pushdown compressors,
or compression algorithms that can be implemented with a pushdown transducer.
This comparison is made without any a priori assumption on the data's source
and considering the asymptotic compression ratio for infinite sequences. We
prove that Lempel-Ziv is incomparable with pushdown compressors.
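For context, LZ78 itself is easy to state; a minimal, unoptimized sketch that emits (dictionary index, next symbol) pairs rather than a packed bitstream (for orientation only; the paper's pushdown compressors and their analysis are a different, more general machinery):

    def lz78_compress(s):
        # Minimal LZ78: emit (index of longest known phrase, next symbol) pairs.
        dictionary = {"": 0}         # phrase -> index; the empty phrase is index 0
        phrase, output = "", []
        for ch in s:
            if phrase + ch in dictionary:
                phrase += ch         # keep extending the current phrase
            else:
                output.append((dictionary[phrase], ch))
                dictionary[phrase + ch] = len(dictionary)
                phrase = ""
        if phrase:                   # flush a trailing phrase that is already known
            output.append((dictionary[phrase[:-1]], phrase[-1]))
        return output

    print(lz78_compress("abababab"))   # [(0, 'a'), (0, 'b'), (1, 'b'), (3, 'a'), (0, 'b')]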