Meta-Learning with a Geometry-Adaptive Preconditioner
Model-agnostic meta-learning (MAML) is one of the most successful
meta-learning algorithms. It has a bi-level optimization structure in which
the outer loop learns a shared initialization and the inner loop optimizes
task-specific weights. Although MAML relies on standard gradient descent in
the inner loop, recent studies have shown that controlling the inner loop's
gradient descent with a meta-learned preconditioner can be beneficial.
Existing preconditioners, however, cannot adapt in a way that is
simultaneously task-specific and path-dependent. Moreover, they do not
satisfy the Riemannian metric condition, which would enable steepest-descent
learning with the preconditioned gradient.
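A preconditioner defines a valid Riemannian metric only if it is symmetric
positive definite at every point. One generic way to enforce this is a
Cholesky-style construction M = L L^T + eps*I, sketched below; this is
illustrative only and is not GAP's parameterization.

```python
import torch

def spd_metric(raw: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    """Map an unconstrained square matrix to a symmetric positive-definite
    one via M = L L^T + eps*I, so preconditioning with M satisfies the
    Riemannian metric condition. Illustrative; not GAP's construction.
    """
    L = torch.tril(raw)  # lower-triangular factor
    return L @ L.T + eps * torch.eye(raw.shape[0],
                                     device=raw.device, dtype=raw.dtype)
```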
In this study, we propose Geometry-Adaptive Preconditioned gradient descent
(GAP), which overcomes these limitations: GAP efficiently meta-learns a
preconditioner that depends on the task-specific parameters, and this
preconditioner can be shown to be a Riemannian metric. Thanks to these two
properties, the geometry-adaptive preconditioner is effective for improving
inner-loop optimization.
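Putting the pieces together, a hedged sketch of the outer loop follows,
reusing `preconditioned_inner_loop` from above. `make_precond` is a
hypothetical factory turning meta-learned parameters into the preconditioning
hook; this is standard second-order meta-training, not GAP's exact procedure.

```python
import torch

def outer_step(theta, precond_params, tasks, loss_fn, meta_opt, inner_lr=0.01):
    """One meta-training step: the shared initialization `theta` and the
    preconditioner parameters `precond_params` are meta-learned jointly.
    All names here are hypothetical; see the repository for GAP's actual
    training loop.
    """
    meta_opt.zero_grad()
    meta_loss = 0.0
    for support, query in tasks:
        # Inner loop: adapt a task-specific copy with preconditioned steps.
        adapted = preconditioned_inner_loop(
            theta, make_precond(precond_params), support, loss_fn, lr=inner_lr)
        x_q, y_q = query
        meta_loss = meta_loss + loss_fn(adapted, x_q, y_q)  # query-set loss
    meta_loss.backward()  # meta-gradients flow back through the inner loop
    meta_opt.step()       # updates both theta and precond_params
```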
Experimental results show that GAP outperforms the state-of-the-art MAML
family and the preconditioned gradient descent-MAML (PGD-MAML) family on a
variety of few-shot learning tasks. Code is available at:
https://github.com/Suhyun777/CVPR23-GAP.

Comment: Accepted at CVPR 2023. Code is available at:
https://github.com/Suhyun777/CVPR23-GAP; this is an extended version of our
previous CVPR23 work.