Embrace concept drift: a novel solution for online continual learning

Abstract

Continual learning is a critical area of research in machine learning that aims to enable models to learn new information without forgetting previously acquired knowledge. Online continual learning, in particular, addresses the challenges of learning from a stream of data in real-world environments where the data can be unbounded and heterogeneous. Two main problems must be addressed in online continual learning: the first is catastrophic forgetting, a phenomenon in which the model forgets previously learned knowledge while learning new tasks; the second is concept drift, a situation in which the distribution of the data changes over time. These issues further complicate the learning process compared to traditional machine learning. In this thesis, we propose a general framework for online continual learning that leverages both regularization-based and memory-based methods to mitigate catastrophic forgetting and handle concept drift. Specifically, we introduce a novel concept drift detection algorithm based on the confidence values of the samples. We present a novel online continual learning paradigm that uses concept drift as a rehearsal signal to improve performance by consolidating or expanding the memory center. We also apply data condensation approaches to online continual learning in order to perform memory-efficient rehearsal. Furthermore, we evaluate accuracy on both old and new tasks, comparing against several benchmark models, and present a novel evaluation metric, Stability and Plasticity Balance, to measure the balance between old-task and new-task accuracy. We evaluate our proposed approach on a new benchmark dataset framework, Continual Online Learning (COnL), which covers two scenarios of online continual learning: class-incremental learning and instance-incremental learning. In this thesis, the benchmark framework randomly selects a number of incremental classes from three different datasets: TinyImageNet, the German Traffic Sign dataset, and Landmarks.
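The abstract does not specify how the confidence-based drift detector works, so the following is only a minimal illustrative sketch, not the thesis's algorithm: it assumes (hypothetically) that drift is flagged when the mean per-sample confidence over a sliding window falls below a reference mean by a fixed margin. The class name, window size, and margin are all illustrative choices, not details from the thesis.

```python
from collections import deque


class ConfidenceDriftDetector:
    """Illustrative sketch of confidence-based concept drift detection.

    Hypothetical rule: once a reference mean confidence is established,
    drift is flagged when the mean confidence over the most recent
    window drops below the reference by more than `margin`.
    """

    def __init__(self, window_size=50, margin=0.2):
        self.window = deque(maxlen=window_size)  # recent per-sample confidences
        self.reference_mean = None               # mean confidence of the stable regime
        self.margin = margin

    def update(self, confidence):
        """Feed one per-sample confidence (e.g. max softmax); return True on drift."""
        self.window.append(confidence)
        if len(self.window) < self.window.maxlen:
            return False  # not enough samples yet
        current_mean = sum(self.window) / len(self.window)
        if self.reference_mean is None:
            self.reference_mean = current_mean  # establish the reference regime
            return False
        if current_mean < self.reference_mean - self.margin:
            self.reference_mean = current_mean  # adopt the new regime after drift
            return True
        return False
```

In a rehearsal-driven paradigm such as the one described above, a `True` return could serve as the signal to consolidate or expand the memory buffer.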
Our primary results demonstrate that concept drift can serve as a useful signal for memory rehearsal in the online continual learning setting. Our proposed approaches provide a promising direction for future research in online continual learning and have the potential to enable models to learn continuously from unbounded and heterogeneous data streams in real-world environments.
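The exact definition of the Stability and Plasticity Balance metric is not given in this abstract. As one hedged illustration of a metric that "measures the balance between old and new accuracy," a harmonic mean of old-task accuracy (stability) and new-task accuracy (plasticity) would penalize imbalance between the two; this formulation is an assumption, not the thesis's definition.

```python
def stability_plasticity_balance(old_acc, new_acc):
    """Hypothetical balance metric: harmonic mean of old-task accuracy
    (stability) and new-task accuracy (plasticity).

    The harmonic mean is high only when both accuracies are high, so a
    model that sacrifices old tasks for new ones (or vice versa) scores low.
    """
    if old_acc + new_acc == 0:
        return 0.0
    return 2 * old_acc * new_acc / (old_acc + new_acc)
```

For example, a model with balanced accuracies (0.8, 0.8) scores 0.8 under this formulation, while an unbalanced model with (0.9, 0.1) scores only 0.18 despite a similar average.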

ROS: The Research Output Service. Heriot-Watt University Edinburgh

Last updated on 15/07/2024
