
    Distance measures for prototype based classification

    Biehl M, Hammer B, Villmann T. Distance Measures for Prototype Based Classification. In: Grandinetti L, Lippert T, Petkov N, eds. Brain-Inspired Computing. International Workshop, BrainComp 2013, Cetraro, Italy, July 8-11, 2013, Revised Selected Papers. Lecture Notes in Computer Science. Cham: Springer International Publishing; 2014: 100-116.

    The basic concepts of distance-based classification are introduced in terms of clear-cut example systems. The classical k-Nearest-Neighbor (kNN) classifier serves as the starting point of the discussion. Learning Vector Quantization (LVQ) is introduced, which represents the reference data by a few prototypes. This requires a data-driven training process; examples of heuristic and cost-function-based prescriptions are presented. While the most popular measure of dissimilarity in this context is the Euclidean distance, this choice is frequently made without justification. Alternative distances can yield better performance in practical problems. Several examples are discussed, including more general Minkowski metrics and statistical divergences for the comparison of, e.g., histogram data. Furthermore, the framework of relevance learning in LVQ is presented. There, parameters of adaptive distance measures are optimized in the training phase. A practical application of Matrix Relevance LVQ in the context of tumor classification illustrates the approach.
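    The heuristic LVQ1 prescription and the use of a general Minkowski distance mentioned in the abstract can be sketched as follows. This is an illustrative toy example, not code from the paper: the data, learning rate, and prototype initialization are all made up, and p=2 recovers the usual Euclidean case.

    ```python
    import numpy as np

    def minkowski(a, b, p=2.0):
        """Minkowski distance between two vectors; p=2 is Euclidean."""
        return np.sum(np.abs(a - b) ** p) ** (1.0 / p)

    def lvq1_train(X, y, prototypes, proto_labels, lr=0.05, epochs=20, p=2.0):
        """Heuristic LVQ1: the winning (closest) prototype is moved
        toward the sample if its label matches, away otherwise."""
        W = prototypes.copy()
        for _ in range(epochs):
            for x, label in zip(X, y):
                d = [minkowski(x, w, p) for w in W]
                j = int(np.argmin(d))            # winner-takes-all
                sign = 1.0 if proto_labels[j] == label else -1.0
                W[j] += sign * lr * (x - W[j])   # attract or repel
        return W

    def lvq_predict(X, W, proto_labels, p=2.0):
        """Assign each sample the label of its nearest prototype."""
        return np.array([proto_labels[int(np.argmin(
            [minkowski(x, w, p) for w in W]))] for x in X])

    # Toy two-class data: two Gaussian blobs, one prototype per class
    # (hypothetical data purely for illustration).
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
    y = np.array([0] * 50 + [1] * 50)
    W0 = np.array([[1.0, 1.0], [3.0, 3.0]])
    W = lvq1_train(X, y, W0, proto_labels=[0, 1])
    acc = float(np.mean(lvq_predict(X, W, [0, 1]) == y))
    print(round(acc, 2))
    ```

    Relevance learning, also discussed in the paper, would additionally adapt parameters of the distance itself (e.g., per-dimension weights or a full relevance matrix) during training, rather than fixing p and treating all dimensions equally as above.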