
    Learning from non-stationary data using a growing network of prototypes

    Proceeding of: 2013 IEEE Congress on Evolutionary Computation (CEC), Cancun, 20-23 June 2013.
    Learning from non-stationary data requires methods that can deal with a continuous stream of data instances, possibly of infinite size, where the class distributions may drift over time. For handling such datasets, we propose a new method that incrementally creates and adapts a network of prototypes for classifying complex data received in an online fashion. The algorithm includes both accuracy-based and time-based forgetting mechanisms, which ensure that the model size does not grow indefinitely with large datasets. We have performed tests on seven benchmark datasets to compare our proposal with several approaches from the literature, including ensemble algorithms built on two different base classifiers. The results show that our algorithm is comparable to the best of the ensemble classifiers in terms of the accuracy/time trade-off. Moreover, our approach appears to have significant advantages for dealing with data that has a complex, non-linearly separable topology. A rough sketch of the general idea follows below.
    This article has been funded by the Spanish Ministry of Science and Innovation under the project MOVES, grant reference TIN2011-28336, and by NSERC-Canada.
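    The abstract does not give the algorithm's details, so the following is only a minimal illustrative sketch of a prototype-based online classifier with accuracy-based and time-based forgetting, not the paper's method. All names, thresholds, and update rules (PrototypeClassifier, create_threshold, max_age, min_accuracy) are assumptions introduced here for illustration.

    # Hypothetical sketch: an online classifier built on a growing set of prototypes,
    # with simple accuracy-based and time-based forgetting to bound model size.
    import numpy as np

    class PrototypeClassifier:
        def __init__(self, create_threshold=1.0, max_age=500, min_accuracy=0.3):
            self.prototypes = []                      # each: position, label, usage statistics
            self.create_threshold = create_threshold  # distance beyond which a new prototype is created
            self.max_age = max_age                    # time-based forgetting horizon
            self.min_accuracy = min_accuracy          # accuracy-based forgetting threshold
            self.t = 0

        def _nearest(self, x):
            dists = [np.linalg.norm(x - p["pos"]) for p in self.prototypes]
            i = int(np.argmin(dists))
            return i, dists[i]

        def predict(self, x):
            if not self.prototypes:
                return None
            i, _ = self._nearest(np.asarray(x, dtype=float))
            return self.prototypes[i]["label"]

        def partial_fit(self, x, y, lr=0.1):
            x = np.asarray(x, dtype=float)
            self.t += 1
            if not self.prototypes:
                self.prototypes.append({"pos": x.copy(), "label": y,
                                        "hits": 1, "correct": 1, "last_used": self.t})
                return
            i, dist = self._nearest(x)
            proto = self.prototypes[i]
            correct = proto["label"] == y
            proto["hits"] += 1
            proto["correct"] += int(correct)
            proto["last_used"] = self.t
            if correct and dist < self.create_threshold:
                # pull the winning prototype toward the new instance
                proto["pos"] += lr * (x - proto["pos"])
            else:
                # misclassified or too far away: grow the network with a new prototype
                self.prototypes.append({"pos": x.copy(), "label": y,
                                        "hits": 1, "correct": 1, "last_used": self.t})
            # forgetting: drop prototypes that are stale or chronically inaccurate
            self.prototypes = [
                p for p in self.prototypes
                if self.t - p["last_used"] <= self.max_age
                and p["correct"] / p["hits"] >= self.min_accuracy
            ]

    In this sketch, processing a stream is a loop of predict followed by partial_fit on each arriving instance, so drifting class regions are tracked by new prototypes while obsolete ones are forgotten.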

    A New Approach Adapting Neural Network Classifiers to Sudden Changes in Nonstationary Environments

    Businesses are increasingly analyzing streaming data in real time to achieve business objectives such as monetization or quality control. The predictive algorithms applied to streaming data sources are often trained sequentially, updating the model weights as each new data point arrives. When disruptions or changes in the data-generating process occur, the online learning process allows the algorithm to slowly learn the changes; however, there may be a period after concept drift during which the predictive algorithm underperforms. This thesis introduces a method that makes online neural network classifiers more resilient to such concept drifts by using information about the drift to update the neural network parameters; an illustrative sketch of this general idea follows below.
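    Since the abstract does not describe how drift information feeds into the parameter update, the code below is only a hedged sketch of one simple variant: an online linear classifier whose learning rate is temporarily boosted when a drift signal fires. The class name, notify_drift hook, and all constants are hypothetical and not taken from the thesis.

    # Hypothetical sketch: online logistic-regression classifier that re-adapts
    # faster after a reported concept drift by boosting its learning rate.
    import numpy as np

    class DriftAwareOnlineClassifier:
        def __init__(self, n_features, base_lr=0.01, boost_lr=0.5, boost_steps=100):
            self.w = np.zeros(n_features)
            self.b = 0.0
            self.base_lr = base_lr
            self.boost_lr = boost_lr        # aggressive rate used right after drift
            self.boost_steps = boost_steps  # how long the boosted rate is kept
            self.steps_since_drift = None

        def notify_drift(self):
            # called by an external drift detector when the stream changes
            self.steps_since_drift = 0

        def _lr(self):
            if self.steps_since_drift is not None and self.steps_since_drift < self.boost_steps:
                return self.boost_lr
            return self.base_lr

        def predict_proba(self, x):
            return 1.0 / (1.0 + np.exp(-(self.w @ x + self.b)))

        def partial_fit(self, x, y):
            x = np.asarray(x, dtype=float)
            p = self.predict_proba(x)
            grad = p - y                    # gradient of the log-loss for one sample
            lr = self._lr()
            self.w -= lr * grad * x
            self.b -= lr * grad
            if self.steps_since_drift is not None:
                self.steps_since_drift += 1

    The design choice illustrated here is that drift information changes how strongly new samples overwrite old knowledge, rather than discarding the model outright.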