
Large-scale connectionist natural language parsing using lexical semantic and syntactic knowledge

Abstract

Syntactic parsing plays a pivotal role in most automatic natural language processing systems. The research presented in this dissertation focuses on two key characteristics of connectionist models for natural language processing: their adaptability to different tagging conventions, and their ability to apply multiple linguistic constraints in parallel during sentence processing. To investigate these characteristics, an existing hybrid connectionist, corpus-based shift-reduce parsing model was modified. This parser, previously trained to acquire linguistic knowledge from the Lancaster Parsed Corpus, was adapted to learn linguistic knowledge from the Wall Street Journal Corpus. This adaptation is a novel demonstration that the parser, and by extension other similar connectionist models, can adapt to more than one syntactic tagging convention, and hence to the underlying linguistic theories used to annotate these corpora.
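To make the shift-reduce framework concrete, the sketch below shows a minimal symbolic shift-reduce parser over part-of-speech tags. This is purely illustrative and is not the thesis's connectionist model: the toy grammar (`RULES`), the tag inventory, and the greedy reduce strategy are all assumptions for the example; the actual parser learns its shift/reduce decisions from corpus data.

```python
# Toy grammar: a right-hand-side tuple of symbols reduces to a nonterminal.
RULES = {
    ("DT", "NN"): "NP",   # determiner + noun        -> noun phrase
    ("VB", "NP"): "VP",   # verb + noun phrase       -> verb phrase
    ("NP", "VP"): "S",    # noun phrase + verb phrase -> sentence
}

def shift_reduce(tags):
    """Greedy shift-reduce parse of a POS-tag sequence.

    Returns the final stack of symbols; a single "S" means the
    sequence was fully parsed under the toy grammar.
    """
    stack, buffer = [], list(tags)
    while True:
        # Reduce: try to match the top of the stack against a rule,
        # preferring the longest right-hand side.
        for n in (2, 1):
            if len(stack) >= n and tuple(stack[-n:]) in RULES:
                rhs = tuple(stack[-n:])
                del stack[-n:]
                stack.append(RULES[rhs])
                break
        else:
            # Shift: no reduction applied; move the next tag onto the stack.
            if not buffer:
                return stack
            stack.append(buffer.pop(0))

# "The cat chased the dog" as POS tags:
print(shift_reduce(["DT", "NN", "VB", "DT", "NN"]))  # -> ['S']
```

A connectionist variant replaces the hand-written rule lookup with a trained network that scores shift versus reduce actions from the current stack and buffer contents, which is what lets the same architecture be retrained on differently annotated corpora.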
