96 research outputs found

    Implicitly Constrained Semi-Supervised Least Squares Classification

    We introduce a novel semi-supervised version of the least squares classifier. This implicitly constrained least squares (ICLS) classifier minimizes the squared loss on the labeled data over the set of parameters implied by all possible labelings of the unlabeled data. Unlike other discriminative semi-supervised methods, our approach does not introduce explicit additional assumptions into the objective function, but leverages implicit assumptions already present in the choice of the supervised least squares classifier. We show this approach can be formulated as a quadratic programming problem and its solution can be found using a simple gradient descent procedure. We prove that, in a certain sense, our method never leads to performance worse than that of the supervised classifier. Experimental results on benchmark datasets corroborate this theoretical result in the multidimensional case, also in terms of the error rate. Comment: 12 pages, 2 figures, 1 table. The Fourteenth International Symposium on Intelligent Data Analysis (2015), Saint-Etienne, France
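    The optimization the abstract describes — searching over soft labelings of the unlabeled data for the one whose implied least-squares solution minimizes the squared loss on the labeled set — can be sketched with projected gradient descent. This is a minimal illustration under stated assumptions, not the authors' implementation; the function name and all hyperparameter values are hypothetical:

    ```python
    import numpy as np

    def icls_sketch(X_l, y_l, X_u, steps=500, lr=0.05):
        """Illustrative ICLS via projected gradient descent (labels in {0, 1})."""
        add_bias = lambda X: np.hstack([X, np.ones((len(X), 1))])
        Xl, Xu = add_bias(X_l), add_bias(X_u)
        X = np.vstack([Xl, Xu])
        A = np.linalg.pinv(X.T @ X) @ X.T       # maps a labeling to its LS parameters
        q = np.full(len(Xu), 0.5)               # soft labels for the unlabeled points
        for _ in range(steps):
            w = A @ np.concatenate([y_l, q])    # supervised LS solution implied by (y_l, q)
            r = Xl @ w - y_l                    # residuals on the labeled data only
            # gradient of the labeled squared loss w.r.t. the soft labels q
            grad = 2.0 * (A[:, len(y_l):].T @ (Xl.T @ r))
            q = np.clip(q - lr * grad, 0.0, 1.0)  # project back onto [0, 1]
        return A @ np.concatenate([y_l, q])
    ```

    Because the least-squares parameters are linear in the labeling, the labeled-set loss is convex in q, which is what makes a simple descent procedure of this kind plausible.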

    XCS Classifier System with Experience Replay

    XCS constitutes the most deeply investigated classifier system today. It bears strong potential and comes with inherent capabilities for mastering a variety of different learning tasks. Besides outstanding successes in various classification and regression tasks, XCS also proved very effective in certain multi-step environments from the domain of reinforcement learning. Especially in the latter domain, recent advances have been mainly driven by algorithms which model their policies based on deep neural networks -- among which the Deep-Q-Network (DQN) is a prominent representative. Experience Replay (ER) constitutes one of the crucial factors for the DQN's successes, since it facilitates stabilized training of the neural network-based Q-function approximators. Surprisingly, XCS barely takes advantage of similar mechanisms that leverage stored raw experiences encountered so far. To bridge this gap, this paper investigates the benefits of extending XCS with ER. On the one hand, we demonstrate that for single-step tasks ER bears massive potential for improvements in terms of sample efficiency. On the downside, however, we reveal that the use of ER might further aggravate well-studied issues not yet solved for XCS when applied to sequential decision problems demanding long action chains.
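    The experience-replay mechanism the abstract builds on stores raw experiences and replays uniformly sampled mini-batches during learning, rather than updating only on the most recent transition. A minimal buffer sketch (class and method names are illustrative assumptions, not code from the paper):

    ```python
    import random
    from collections import deque

    class ReplayBuffer:
        """Minimal experience-replay buffer, as popularized by DQN-style training."""
        def __init__(self, capacity=10000):
            self.buffer = deque(maxlen=capacity)  # oldest experiences are evicted when full

        def store(self, experience):
            # e.g. a (state, action, reward, next_state) tuple
            self.buffer.append(experience)

        def sample(self, batch_size):
            # uniform sampling breaks the temporal correlation between consecutive updates
            return random.sample(self.buffer, min(batch_size, len(self.buffer)))
    ```

    Extending XCS with such a buffer would mean running each classifier-update step on a replayed mini-batch instead of only on the latest encountered experience.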

    Online learning with nonlinear models


    Improving classification models with context knowledge and variable activation functions

    This work proposes two methods to boost the performance of a given classifier. The first, which works on a neural network classifier, is a new type of trainable activation function, i.e. a function that is adjusted during the learning phase, allowing the network to exploit the data better than with a classic fixed-shape activation function. The second provides two frameworks for using an external knowledge base to improve the classification results.
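    A trainable activation function of the kind the first method describes adjusts its own shape parameter by gradient descent alongside the network weights. A PReLU-style sketch in plain NumPy (the class and the learnable-slope form are illustrative assumptions, not the paper's actual function):

    ```python
    import numpy as np

    class TrainableActivation:
        """f(x) = x for x >= 0, a*x otherwise; the slope a is learned during training."""
        def __init__(self, a=0.25):
            self.a = a

        def forward(self, x):
            self.x = x                       # cache input for the backward pass
            return np.where(x >= 0, x, self.a * x)

        def backward(self, grad_out, lr=0.01):
            # gradient w.r.t. the input (for backpropagation to earlier layers)
            grad_in = grad_out * np.where(self.x >= 0, 1.0, self.a)
            # gradient of the loss w.r.t. the shape parameter a
            da = np.sum(grad_out * np.where(self.x < 0, self.x, 0.0))
            self.a -= lr * da                # the activation's shape is adjusted here
            return grad_in
    ```

    In a full network, each layer could hold its own instance, so the learned shapes can differ per layer.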

    Supervised Learning - An Introduction: Lectures given at the 30th Canary Islands Winter School of Astrophysics

    Based on a set of lectures given at the 30th Canary Islands Winter School of Astrophysics: Big Data Analysis in Astronomy, La Laguna, Tenerife, Spain, 11/2018. To a large extent, the material is taken from an MSc level course, Neural Networks and Computational Intelligence, Computing Science Programme, University of Groningen. These notes present a selection of topics in the area of supervised machine learning. The focus is on the discussion of methods and algorithms for classification tasks. Regression by neural networks is discussed only very briefly, as it is in the center of complementary lectures. The same applies to concepts and methods of unsupervised learning. The selection and presentation of the material is clearly influenced by personal biases and preferences. Nevertheless, the lectures and notes should provide a useful, albeit incomplete, overview and serve as a starting point for further exploration of the fascinating area of machine learning.