Usage-based and emergentist approaches to language acquisition
It was long considered impossible to learn grammar from linguistic experience alone. In the past decade, however, advances in usage-based linguistic theory, computational linguistics, and developmental psychology have changed this view. So-called usage-based and emergentist approaches to language acquisition hold that language can be learned from language use itself, by means of social skills such as joint attention and by means of powerful generalization mechanisms. This paper first summarizes the assumptions regarding the nature of linguistic representations and processing. Usage-based theories are nonmodular and nonreductionist, i.e., they emphasize form-function relationships and deal with all of language, not just selected levels of representation. Furthermore, storage and processing are considered to be analytic as well as holistic, such that there is a continuum between children's unanalyzed chunks and the abstract units found in adult language. In the second part, the empirical evidence is reviewed. Children's linguistic competence is shown to be limited initially, and it is demonstrated how children can generalize knowledge based on direct and indirect positive evidence. It is argued that with these general learning mechanisms, the usage-based paradigm can be extended to multilingual language situations and to language acquisition under special circumstances.
Parsing Corpus-Induced Type-Logical Grammars
Type-logical grammars which have been automatically extracted from linguistic corpora present parsers for these grammars with a considerable challenge. The size of the lexicon and the combinatory possibilities of the lexical entries both call for a rethinking of traditional type-logical parsing strategies. We show how methods from statistical natural language processing can be incorporated into a type-logical parser, give some preliminary data, and sketch some new experiments we expect to produce better results.
Natural language processing
Beginning with the basic issues of NLP, this chapter aims to chart the major research activities in this area since the last ARIST chapter in 1996 (Haas, 1996), including: (i) natural language text processing systems (text summarization, information extraction, information retrieval, etc., including domain-specific applications); (ii) natural language interfaces; (iii) NLP in the context of the WWW and digital libraries; and (iv) evaluation of NLP systems.
Effective weakly supervised semantic frame induction using expression sharing in hierarchical hidden Markov models
We present a framework for the induction of semantic frames from utterances in the context of an adaptive command-and-control interface. The system is trained on an individual user's utterances and the corresponding semantic frames representing controls. During training, no prior information on the alignment between utterance segments and frame slots and values is available. In addition, semantic frames in the training data can contain information that is not expressed in the utterances. To tackle this weakly supervised classification task, we propose a framework based on Hidden Markov Models (HMMs). Structural modifications, resulting in a hierarchical HMM, and an extension called expression sharing are introduced to minimize the amount of training time and effort required for the user.

The dataset used for the present study is PATCOR, which contains commands uttered in the context of a vocally guided card game, Patience. Experiments were carried out on orthographic and phonetic transcriptions of commands, segmented on different levels of n-gram granularity. The experimental results show positive effects of all the studied system extensions, with some differences in effect between the input representations. Moreover, evaluation experiments on held-out data with the optimal system configuration show that the extended system is able to achieve high accuracies with relatively small amounts of training data.
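The core idea in this abstract — hidden HMM states standing in for frame slots, with the word-to-slot alignment never annotated — can be sketched with a plain (non-hierarchical) HMM and Viterbi decoding. This is a toy illustration, not the paper's system or the PATCOR data: the slot names, vocabulary, and all probabilities below are invented for the example.

```python
# Toy sketch: HMM states act as frame slots for a card-game command,
# observations are command words, and Viterbi decoding recovers the
# word-to-slot alignment that weakly supervised training never sees.
# All slots, words, and probabilities here are hypothetical.

states = ["ACTION", "VALUE", "TARGET"]            # hypothetical frame slots
vocab = ["move", "red", "seven", "to", "stack"]   # hypothetical command words

pi = [0.8, 0.1, 0.1]                              # initial slot probabilities
A = [[0.1, 0.6, 0.3],                             # slot-to-slot transitions
     [0.1, 0.3, 0.6],
     [0.2, 0.3, 0.5]]
B = [[0.6, 0.1, 0.1, 0.1, 0.1],                   # emission P(word | slot)
     [0.05, 0.45, 0.45, 0.025, 0.025],
     [0.1, 0.1, 0.1, 0.35, 0.35]]

def viterbi(obs):
    """Return the most probable slot label for each observed word."""
    n = len(states)
    delta = [pi[s] * B[s][obs[0]] for s in range(n)]
    psi = []                                      # backpointers per step
    for o in obs[1:]:
        back, new_delta = [], []
        for j in range(n):
            best = max(range(n), key=lambda i: delta[i] * A[i][j])
            back.append(best)
            new_delta.append(delta[best] * A[best][j] * B[j][o])
        psi.append(back)
        delta = new_delta
    path = [max(range(n), key=lambda s: delta[s])]
    for back in reversed(psi):                    # trace backpointers
        path.append(back[path[-1]])
    return [states[s] for s in reversed(path)]

utterance = [vocab.index(w) for w in "move red seven to stack".split()]
print(viterbi(utterance))  # ['ACTION', 'VALUE', 'VALUE', 'TARGET', 'TARGET']
```

In the paper's setting the parameters would be estimated from the user's utterance/frame pairs (e.g. with EM-style training) rather than hand-set, and the hierarchical structure and expression sharing would further constrain which slot sequences are possible.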