18 research outputs found

    SHOE: The extraction of hierarchical structure for machine learning of natural language

    Default inheritance in an object-oriented representation of linguistic categories

    A distributed, yet symbolic model of text-to-speech processing

    Implicit schemata and categories in memory-based language processing

    IGTree: Using trees for compression and classification in lazy learning algorithms

    We describe the IGTree learning algorithm, which compresses an instance base into a tree structure. The concept of information gain is used as a heuristic function for performing this compression. IGTree produces trees that, compared to other lazy learning approaches, reduce storage requirements and the time required to compute classifications. Furthermore, we obtained similar or better generalization accuracy with IGTree when trained on two complex linguistic tasks, viz. letter–phoneme transliteration and part-of-speech tagging, compared to alternative lazy learning and decision-tree approaches (viz. IB1, information-gain-weighted IB1, and C4.5). A third experiment, on the task of word hyphenation, demonstrates that when the mutual differences in the information gain of features are too small, both IGTree and information-gain-weighted IB1 perform worse than IB1. These results indicate that IGTree is a useful algorithm for problems characterized by the availability of a large number of training instances described by symbolic features with sufficiently differing information gain values.
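    The abstract describes the core IGTree idea: order the features by information gain, compress the instance base into a tree whose nodes store the locally most frequent class as a default, and classify by following matching arcs, backing off to the stored default when no arc matches. The sketch below illustrates that idea in Python under stated assumptions; the names (information_gain, build_igtree, classify) and the dictionary-based node layout are illustrative choices, not the authors' reference implementation, and details such as arc pruning are simplified.

```python
# Minimal illustrative sketch of an IGTree-style build/classify pass.
# Assumes instances are (feature_dict, label) pairs with symbolic values.
import math
from collections import Counter

def information_gain(instances, feature):
    """Class-label entropy minus the feature-conditional entropy."""
    def entropy(labels):
        counts = Counter(labels)
        total = len(labels)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    labels = [label for _, label in instances]
    base = entropy(labels)
    remainder = 0.0
    for value in {features[feature] for features, _ in instances}:
        subset = [label for features, label in instances if features[feature] == value]
        remainder += len(subset) / len(instances) * entropy(subset)
    return base - remainder

def build_igtree(instances, feature_order):
    """Compress an instance base into a tree.

    Each node stores the most frequent class of its subset as a default;
    arcs are expanded only while the subset is not yet homogeneous.
    """
    default = Counter(label for _, label in instances).most_common(1)[0][0]
    node = {"default": default, "arcs": {}}
    if not feature_order or all(label == default for _, label in instances):
        return node  # homogeneous subset: stop expanding, keep only the default
    feature, rest = feature_order[0], feature_order[1:]
    by_value = {}
    for features, label in instances:
        by_value.setdefault(features[feature], []).append((features, label))
    for value, subset in by_value.items():
        node["arcs"][value] = build_igtree(subset, rest)
    return node

def classify(node, features, feature_order):
    """Follow matching arcs; fall back to the stored default on a mismatch."""
    for feature in feature_order:
        child = node["arcs"].get(features[feature])
        if child is None:
            return node["default"]
        node = child
    return node["default"]
```

    A small usage example, with hypothetical two-feature data: the feature order is fixed once by descending information gain, then reused for both building and classification.

```python
data = [({"f1": "a", "f2": "x"}, "A"), ({"f1": "b", "f2": "x"}, "B")]
order = sorted(data[0][0], key=lambda f: information_gain(data, f), reverse=True)
tree = build_igtree(data, order)
print(classify(tree, {"f1": "a", "f2": "y"}, order))  # -> "A" (backs off via f1 arc)
```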