This paper focuses on an under-explored yet important problem: Federated
Class-Continual Learning (FCCL), where new classes are dynamically added in
federated learning. Existing FCCL works suffer from various limitations, such
as requiring additional datasets or storing the private data from previous
tasks. In response, we first demonstrate that non-IID data exacerbates
the catastrophic forgetting issue in FL. Then we propose a novel method called
TARGET (federaTed clAss-continual leaRninG via Exemplar-free disTillation), which alleviates
catastrophic forgetting in FCCL while preserving client data privacy. Our
proposed method leverages the previously trained global model to transfer
knowledge of old tasks to the current task at the model level. Moreover, a
generator is trained to produce synthetic data to simulate the global
distribution of data on each client at the data level. Compared to previous
FCCL methods, TARGET does not require any additional datasets or the storage of real
data from previous tasks, which makes it ideal for data-sensitive scenarios.
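To make the two-level idea concrete, below is a minimal PyTorch-style sketch of exemplar-free distillation: a frozen previous global model teaches the current model on generator-produced synthetic samples, so no real data from earlier tasks is stored or shared. The function, argument names, and loss details (softened KL distillation) are illustrative assumptions for exposition, not the paper's actual implementation, which also specifies how the generator itself is trained.

import torch
import torch.nn.functional as F

def distill_on_synthetic_data(generator, old_global_model, current_model,
                              optimizer, num_old_classes, steps=100,
                              batch_size=64, noise_dim=100, temperature=2.0):
    """Sketch (assumed setup): distill old-task knowledge from the frozen
    previous global model into the current model using synthetic inputs."""
    old_global_model.eval()
    current_model.train()
    for _ in range(steps):
        # Sample latent noise and synthesize pseudo-inputs intended to
        # mimic the global distribution of the old tasks (data level).
        z = torch.randn(batch_size, noise_dim)
        synthetic_x = generator(z)

        with torch.no_grad():
            teacher_logits = old_global_model(synthetic_x)[:, :num_old_classes]
        student_logits = current_model(synthetic_x)[:, :num_old_classes]

        # Softened KL divergence between teacher and student predictions
        # transfers old-task knowledge to the current model (model level).
        loss = F.kl_div(
            F.log_softmax(student_logits / temperature, dim=1),
            F.softmax(teacher_logits / temperature, dim=1),
            reduction="batchmean",
        ) * (temperature ** 2)

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

In practice this distillation term would be combined with the standard cross-entropy loss on each client's current-task data during local training.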