
Transformation-Based Learning in the Fast Lane

By Grace Ngai and Radu Florian

Abstract

Transformation-based learning has been successfully employed to solve many natural language processing problems. It achieves state-of-the-art performance on many natural language processing tasks and does not overtrain easily. However, it does have a serious drawback: the training time is often intolerably long, especially on the large corpora which are often used in NLP. In this paper, we present a novel and realistic method for speeding up the training time of a transformation-based learner without sacrificing performance. The paper compares and contrasts the training time needed and performance achieved by our modified learner with two other systems: a standard transformation-based learner, and the ICA system (Hepple, 2000). The results of these experiments show that our system is able to achieve a significant improvement in training time while still achieving the same performance as a standard transformation-based learner. This is a valuable contribution to systems and algorithms which utilize transformation-based learning at any part of the execution.
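For context, the baseline the abstract refers to is the greedy transformation-based learning (TBL) loop of Brill-style taggers: start from an initial annotation, repeatedly pick the candidate rule with the largest net error reduction on the training data, and apply it. The sketch below is a minimal illustration of that generic loop on a toy tagging task, not the speedup method this paper proposes; the rule template, data, and function name are all invented for illustration.

```python
# Minimal sketch of the generic TBL training loop (Brill-style).
# Toy rule template: "change tag FROM to tag TO when the previous
# token's tag is PREV". All names and data here are illustrative.

def tbl_train(words, gold, initial_tags, max_rules=10):
    """Greedily learn (prev_tag, from_tag, to_tag) transformation rules."""
    current = list(initial_tags)
    rules = []
    while len(rules) < max_rules:
        best_rule, best_gain = None, 0
        seen = set()
        # Enumerate candidate rules instantiated from training errors.
        for i in range(1, len(words)):
            cand = (current[i - 1], current[i], gold[i])
            if cand[1] == cand[2] or cand in seen:
                continue
            seen.add(cand)
            prev, frm, to = cand
            # Net gain = corrections made minus new errors introduced.
            gain = 0
            for j in range(1, len(words)):
                if current[j - 1] == prev and current[j] == frm:
                    gain += (gold[j] == to) - (gold[j] == frm)
            if gain > best_gain:
                best_rule, best_gain = cand, gain
        if best_rule is None:   # no rule reduces error: stop
            break
        prev, frm, to = best_rule
        for j in range(1, len(words)):
            if current[j - 1] == prev and current[j] == frm:
                current[j] = to
        rules.append(best_rule)
    return rules, current
```

The expensive part is the candidate-scoring pass, which in a naive learner rescans the whole corpus for every candidate rule on every iteration; this quadratic behavior is the training-time bottleneck the paper addresses.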

Year: 2001
OAI identifier: oai:CiteSeerX.psu:10.1.1.19.9194
Provided by: CiteSeerX

