Learning quickly and continually remains a challenging goal for neural
networks. Indeed, many real-world applications do not reflect the learning
setting in which neural networks shine, as data are usually scarce, mostly
unlabelled, and arrive as a stream. To narrow this gap, we introduce FUSION - Few-shot
UnSupervIsed cONtinual learning - a novel strategy designed for
neural networks that "learn in the wild", simulating a realistic distribution and
flow of unbalanced tasks. We equip FUSION with MEML - Meta-Example
Meta-Learning - a new module that simultaneously alleviates catastrophic
forgetting and favours generalisation and the future learning of new tasks. To
encourage feature reuse during meta-optimisation, our model exploits a
single inner loop per task, taking advantage of an aggregated representation
obtained through a self-attention mechanism. To further enhance the
generalisation capability of MEML, we extend it by adopting a technique that
creates various augmented tasks and optimises over the hardest one. Experimental
results on few-shot learning benchmarks show that our model outperforms the
other baselines in both the FUSION and the fully supervised settings. We also
explore how it behaves in standard continual learning, consistently outperforming
state-of-the-art approaches.
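As a rough illustration of the single-inner-loop idea described above, the sketch below (our own hypothetical PyTorch code, not the paper's implementation) condenses the embeddings of each class into one attention-weighted "meta-example" and takes a single MAML-style adaptation step on these aggregates; the encoder, classifier, aggregation layer and inner learning rate are all placeholder assumptions.

```python
# Hypothetical sketch: aggregate each class into a single "meta-example"
# with learned attention, then take ONE inner-loop update per task.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionAggregator(nn.Module):
    """Attention-weighted sum of a set of embeddings (illustrative layer)."""

    def __init__(self, dim: int):
        super().__init__()
        self.score = nn.Linear(dim, 1)

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        w = F.softmax(self.score(z), dim=0)   # (n, 1) attention weights
        return (w * z).sum(dim=0)             # (dim,) aggregated meta-example


def single_inner_loop(encoder, classifier, aggregator, x, y, inner_lr=0.05):
    """One adaptation step per task, computed on class-wise meta-examples."""
    z = encoder(x)                                                # (n, dim)
    classes = y.unique()
    meta_examples = torch.stack([aggregator(z[y == c]) for c in classes])
    targets = torch.arange(len(classes))                          # relabel 0..k-1
    loss = F.cross_entropy(classifier(meta_examples), targets)
    grads = torch.autograd.grad(loss, list(classifier.parameters()),
                                create_graph=True)
    # MAML-style fast weights: updated copies that keep the graph alive,
    # so the outer meta-optimisation can back-propagate through this step.
    return [p - inner_lr * g for p, g in zip(classifier.parameters(), grads)]
```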
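The hardest-task extension can be hedged into a few lines along the same assumptions: given a set of candidate augmentations (placeholders here, since the abstract does not specify them), each produces a view of the task, and only the view with the largest loss is back-propagated.

```python
# Illustrative sketch of "optimise over the hardest" task augmentation.
import torch


def hardest_task_loss(task_x, task_y, augmentations, loss_fn):
    """Evaluate every augmented view of a task and keep only the hardest."""
    losses = [loss_fn(augment(task_x), task_y) for augment in augmentations]
    return torch.stack(losses).max()  # gradients flow through the worst view only
```

In a training loop, the returned scalar would stand in for the plain task loss, so each update is taken on the most difficult augmented version of the current task.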