
    k-NN Boosting Prototype Learning for Object Classification

    Object classification is a challenging task in computer vision. Many approaches have been proposed to extract meaningful descriptors from images and to classify them in a supervised learning framework. In this paper, we revisit the classic k-nearest neighbors (k-NN) classification rule, which has been shown to be very effective when dealing with local image descriptors. However, k-NN still suffers from major drawbacks, mainly due to the uniform voting among the nearest prototypes in the feature space. In this paper, we propose a generalization of the classic k-NN rule in a supervised learning (boosting) framework. Namely, we redefine the voting rule as a strong classifier that linearly combines predictions from the k closest prototypes. To induce this classifier, we propose a novel learning algorithm, MLNN (Multiclass Leveraged Nearest Neighbors), which gives a simple procedure for performing prototype selection very efficiently. We tested our method on 12 categories of objects and observed significant improvement over classic k-NN in terms of classification performance.
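    The leveraged voting rule admits a compact sketch: each prototype j carries a learned coefficient alpha_j, and the k nearest prototypes vote with these coefficients instead of uniformly. Below is a minimal Python illustration of the prediction rule only; the function name and toy values are hypothetical, and the MLNN procedure that would actually learn the coefficients is not reproduced here.

```python
import numpy as np

def leveraged_knn_predict(x, prototypes, labels, alphas, k=5):
    """Weighted k-NN vote: each of the k nearest prototypes adds its
    leveraging coefficient alpha_j to its class score (classic k-NN
    is recovered by setting all alpha_j = 1)."""
    dists = np.linalg.norm(prototypes - x, axis=1)   # distances to all prototypes
    nn = np.argsort(dists)[:k]                       # indices of the k closest
    scores = np.zeros(labels.max() + 1)
    for j in nn:
        scores[labels[j]] += alphas[j]
    return scores.argmax()

# Hypothetical toy usage: three prototypes in R^2, two classes.
P = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]])
L = np.array([0, 1, 1])
a = np.array([0.9, 0.4, 0.2])                        # learned coefficients
print(leveraged_knn_predict(np.array([0.2, 0.1]), P, L, a, k=2))  # -> 0
```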

    Generalization properties of finite size polynomial Support Vector Machines

    The learning properties of finite size polynomial Support Vector Machines are analyzed in the case of realizable classification tasks. The normalization of the high order features acts as a squeezing factor, introducing a strong anisotropy in the distribution of patterns in feature space. As a function of the training set size, the corresponding generalization error presents a crossover, more or less abrupt depending on the distribution's anisotropy and on the task to be learned, between a fast-decreasing and a slowly decreasing regime. This behaviour corresponds to the stepwise decrease found by Dietrich et al. [Phys. Rev. Lett. 82 (1999) 2975-2978] in the thermodynamic limit. The theoretical results are in excellent agreement with the numerical simulations. Comment: 12 pages, 7 figures
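    As a rough way to visualize the crossover, one can plot the test error of a polynomial-kernel SVM against the training set size on a realizable task. The sketch below, assuming scikit-learn and a synthetic quadratic teacher rule chosen for illustration, only mimics the setting; it is not the paper's analysis.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
d = 10
w = rng.standard_normal(d) / np.sqrt(d)      # hidden teacher direction

def sample(n):
    X = rng.standard_normal((n, d))
    y = np.sign((X @ w) ** 2 - 1.0)          # realizable quadratic rule
    return X, y

X_test, y_test = sample(5000)
for n in [20, 50, 100, 200, 500, 1000, 2000]:
    X, y = sample(n)
    clf = SVC(kernel="poly", degree=2, coef0=1.0).fit(X, y)
    err = 1.0 - clf.score(X_test, y_test)
    print(f"n = {n:5d}   generalization error = {err:.3f}")
```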

    Forgetting Exceptions is Harmful in Language Learning

    We show that in language learning, contrary to received wisdom, keeping exceptional training instances in memory can be beneficial for generalization accuracy. We investigate this phenomenon empirically on a selection of benchmark natural language processing tasks: grapheme-to-phoneme conversion, part-of-speech tagging, prepositional-phrase attachment, and base noun phrase chunking. In a first series of experiments we combine memory-based learning with training set editing techniques, in which instances are edited based on their typicality and class prediction strength. Results show that editing exceptional instances (with low typicality or low class prediction strength) tends to harm generalization accuracy. In a second series of experiments we compare memory-based learning and decision-tree learning methods on the same selection of tasks, and find that decision-tree learning often performs worse than memory-based learning. Moreover, the decrease in performance can be linked to the degree of abstraction from exceptions (i.e., pruning or eagerness). We provide explanations for both results in terms of the properties of the natural language processing tasks and the learning algorithms. Comment: 31 pages, 7 figures, 10 tables. Uses 11pt, fullname, a4wide TeX styles. Preprint version of article to appear in Machine Learning 11:1-3, Special Issue on Natural Language Learning. Figures on page 22 slightly compressed to avoid page overload.
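    The contrast between instance-keeping and exception-abstracting learners is easy to reproduce in miniature. The sketch below, assuming scikit-learn and substituting the digits dataset for the paper's NLP benchmarks, compares a 1-NN memory-based learner with a pruned decision tree; the pruning strength is an arbitrary assumption.

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Memory-based learning: keeps every training instance, exceptions included.
knn = KNeighborsClassifier(n_neighbors=1).fit(X_tr, y_tr)

# Decision-tree learning: abstracts away from exceptions; cost-complexity
# pruning stands in for the pruning/eagerness discussed in the abstract.
tree = DecisionTreeClassifier(ccp_alpha=0.01, random_state=0).fit(X_tr, y_tr)

print("1-NN accuracy:", knn.score(X_te, y_te))
print("tree accuracy:", tree.score(X_te, y_te))
```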

    Simulated annealing based pattern classification

    A method is described for finding decision boundaries, approximated by piecewise linear segments, for classifying patterns in R^N, N ≥ 2, using Simulated Annealing (SA). It involves the generation and placement of a set of hyperplanes (represented by strings) in the feature space that yields minimum misclassification. Theoretical analysis shows that as the size of the training data set approaches infinity, the boundary provided by the SA-based classifier approaches the Bayes boundary. The effectiveness of the classification methodology, along with the generalization ability of the decision boundary, is demonstrated for both artificial data and real-life data sets having non-linear/overlapping class boundaries. Results are compared extensively with those of the Bayes classifier, the k-NN rule, the multilayer perceptron, and Genetic Algorithms, another popular evolutionary technique. Empirical verification of the theoretical claim is also provided.
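    A stripped-down version of the idea, annealing a single hyperplane rather than the string-encoded set of hyperplanes used in the paper, might look as follows; the toy data, proposal step size, and cooling schedule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy two-class data in R^2, separable by a hyperplane.
X = rng.standard_normal((200, 2))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

def misclassified(w, b):
    return np.sum(((X @ w + b > 0).astype(int)) != y)

w, b = rng.standard_normal(2), 0.0
cost, T = misclassified(w, b), 1.0
for step in range(5000):
    w_new = w + 0.1 * rng.standard_normal(2)     # perturb the hyperplane
    b_new = b + 0.1 * rng.standard_normal()
    c_new = misclassified(w_new, b_new)
    # Accept downhill moves always, uphill moves with Boltzmann probability.
    if c_new <= cost or rng.random() < np.exp((cost - c_new) / T):
        w, b, cost = w_new, b_new, c_new
    T *= 0.999                                   # geometric cooling schedule

print("training misclassifications:", cost)
```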

    Generalized Matrix Mechanics

    We propose a generalization of Heisenberg's matrix mechanics based on many-index objects. It is shown that there exists a solution describing a harmonic oscillator and that many-index objects lead to a generalization of the spin algebra. Comment: LaTeX, 16 pages; several corrections in Sections 2.2 and 2.3, and some typos corrected
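    For reference, the standard two-index matrix mechanics being generalized rests on the canonical commutator and the Heisenberg equation of motion; the many-index generalization itself is not reproduced here.

```latex
% Standard matrix mechanics for the harmonic oscillator:
% canonical commutator and Heisenberg equation of motion
% (I denotes the identity matrix).
\begin{align}
  [X, P] &= XP - PX = i\hbar\, I, \\
  i\hbar \frac{dA}{dt} &= [A, H],
  \qquad H = \frac{P^2}{2m} + \frac{m\omega^2}{2}\, X^2 .
\end{align}
```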

    Classical and Quantum Nambu Mechanics

    The classical and quantum features of Nambu mechanics are analyzed and fundamental issues are resolved. The classical theory is reviewed and developed utilizing varied examples. The quantum theory is discussed in a parallel presentation and illustrated with detailed specific cases. Quantization is carried out with standard Hilbert space methods. With the proper physical interpretation, obtained by allowing for different time scales on different invariant sectors of a theory, the resulting non-Abelian approach to quantum Nambu mechanics is shown to be fully consistent. Comment: 44 pages, 1 figure, 1 table. Minor changes to conform to journal version
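    For context, the classical Nambu bracket on R^3 is the Jacobian determinant of three functions, and time evolution is driven by a pair of Hamiltonians:

```latex
% Classical Nambu bracket (Jacobian determinant; summation over i, j, k)
% and the Nambu equation of motion with two Hamiltonians H_1, H_2.
\begin{align}
  \{f, g, h\} &= \frac{\partial(f, g, h)}{\partial(x, y, z)}
   = \epsilon_{ijk}\, \partial_i f \, \partial_j g \, \partial_k h, \\
  \frac{df}{dt} &= \{f, H_1, H_2\}.
\end{align}
```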