Multilingual text recognition (MLTR) systems typically focus on a fixed set of languages, which makes it difficult to handle newly added languages or adapt to ever-changing data distributions. In this paper, we propose the Incremental
MLTR (IMLTR) task in the context of incremental learning (IL), where different
languages are introduced in batches. IMLTR is particularly challenging due to rehearsal-imbalance: the uneven distribution of character samples in the rehearsal set, which retains a small amount of old data as past memory. To address this issue, we propose a Multiplexed Routing Network
(MRN). MRN trains a separate recognizer for each language seen so far. A language-domain predictor is then learned on the rehearsal set to weigh the recognizers. Since the recognizers are derived from the original data, MRN reduces reliance on older data and better resists catastrophic forgetting, the core issue in IL. We extensively evaluate
MRN on the MLT17 and MLT19 datasets. It outperforms existing general-purpose IL
methods by large margins, with average accuracy improvements ranging from 10.3%
to 35.8% under different settings. Code is available at
https://github.com/simplify23/MRN.

Comment: Accepted by ICCV 202
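The routing step described above can be sketched in a few lines: a language-domain predictor scores each domain, the scores are normalized with a softmax, and the per-language recognizer logits are fused as a weighted sum. This is a minimal illustrative sketch, not the paper's implementation; the function names (`softmax`, `route`) and the toy two-recognizer example are assumptions for illustration only.

```python
import math

def softmax(scores):
    # Normalize raw domain scores into weights that sum to 1.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def route(recognizer_logits, domain_scores):
    """Fuse per-language recognizer outputs by predicted domain weights.

    recognizer_logits: one logit vector per language-specific recognizer
    domain_scores: raw scores from the language-domain predictor
    """
    weights = softmax(domain_scores)
    num_classes = len(recognizer_logits[0])
    return [
        sum(w * logits[i] for w, logits in zip(weights, recognizer_logits))
        for i in range(num_classes)
    ]

# Toy example: two recognizers; the domain predictor strongly favors the first,
# so the fused logits are dominated by recognizer 0.
fused = route([[2.0, 0.0], [0.0, 2.0]], [10.0, 0.0])
```

Because the fusion is a convex combination, a confident domain prediction effectively selects one recognizer, while an uncertain one blends several.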