Few-shot learning (FSL) aims to train a strong classifier using limited
labeled examples. Many existing works take the meta-learning approach, sampling
few-shot tasks in turn and optimizing the few-shot learner's performance in
classifying the query examples. In this paper, we point out two potential
weaknesses of this approach. First, the sampled query examples may not provide
sufficient supervision for the few-shot learner. Second, the effectiveness of
meta-learning diminishes sharply with increasing shots (i.e., the number of
training examples per class). To resolve these issues, we propose a novel
objective to directly train the few-shot learner to perform like a strong
classifier. Concretely, we associate each sampled few-shot task with a strong
classifier, which is learned with ample labeled examples. The strong classifier
generalizes better, and we use it to supervise the few-shot
learner. We present an efficient way to construct the strong classifier, making
our proposed objective an easy plug-and-play term for existing
meta-learning-based FSL methods. We validate our approach in combination with several
representative meta-learning methods. On several benchmark datasets including
miniImageNet and tieredImageNet, our approach leads to a notable improvement
across a variety of tasks. More importantly, with our approach, meta-learning
based FSL methods can consistently outperform non-meta-learning based ones,
even in a many-shot setting, greatly strengthening their applicability.
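
To make the proposed objective concrete, the sketch below shows one possible meta-training step under our reading of the abstract: the usual query-classification loss is kept, and a distillation-style term pushes the few-shot learner's predictions toward those of a strong classifier trained with ample labels. This is a minimal illustration, not the authors' actual implementation; all names (few_shot_learner, strong_classifier, sample_task, lambda_kd, temperature) are hypothetical.

```python
# A minimal PyTorch sketch of supervising a few-shot learner with a strong
# classifier, assuming the strong classifier is pre-trained on ample labeled
# data for the sampled task's classes. Names and interfaces are illustrative.
import torch
import torch.nn.functional as F

def meta_train_step(few_shot_learner, strong_classifier, sample_task,
                    optimizer, lambda_kd=1.0, temperature=4.0):
    """One meta-training step with the strong-classifier supervision term."""
    # Sample an N-way K-shot task: a support set to adapt on and a query set.
    support_x, support_y, query_x, query_y = sample_task()

    # The few-shot learner builds a task-specific classifier from the support
    # set (e.g., prototypes or an inner-loop fine-tuned head) and scores queries.
    task_logits = few_shot_learner(support_x, support_y, query_x)

    # Standard meta-learning objective: classify the query examples correctly.
    ce_loss = F.cross_entropy(task_logits, query_y)

    # Added plug-and-play term: match the strong classifier's (softened)
    # predictions, since it generalizes better than the query labels alone.
    with torch.no_grad():
        strong_logits = strong_classifier(query_x)
    kd_loss = F.kl_div(
        F.log_softmax(task_logits / temperature, dim=-1),
        F.softmax(strong_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2

    loss = ce_loss + lambda_kd * kd_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because the extra term is a single loss added to the episodic objective, it composes with any meta-learner that produces query logits, which is how we read the "plug-and-play" claim above.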