Analysis of the impact of class ordering in class
incremental image classification
Abstract
The benefits of incremental learning make it desirable for many real-world applications. It enables efficient use of resources by eliminating the need to retrain from scratch whenever the considered set of tasks is updated. It also reduces memory usage, which is particularly important where privacy constraints apply, such as in the healthcare sector, where storing patient data for long periods is prohibited. However, the main challenge of incremental learning is catastrophic forgetting: the performance on previously learned tasks declines after a new task is learned. Various incremental learning methods have been proposed to overcome this challenge.
In this work, we explore the influence of class ordering on class-incremental learning and the resilience of a given method to different class orderings. We also examine how the complexity of the incremental learning scenario, i.e., the task-split strategy, affects the model's performance. We start from a pre-existing approach and then introduce extensions to improve its performance.
Experimental results show that the model's performance is only modestly affected by the order in which classes are presented, whereas the complexity of the incremental tasks plays a crucial role in determining the model's performance. Additionally, starting with a larger
number of classes typically results in better performance.