Dual-Balancing for Multi-Task Learning
Multi-task learning (MTL), a paradigm for learning multiple related
tasks simultaneously, has achieved great success in various fields. However,
the task balancing problem remains a significant challenge in MTL, as
disparities in loss and gradient scales across tasks often compromise performance. In
this paper, we propose a Dual-Balancing Multi-Task Learning (DB-MTL) method to
alleviate the task balancing problem from both loss and gradient perspectives.
Specifically, DB-MTL ensures loss-scale balancing by performing a logarithm
transformation to each task loss, and guarantees gradient-magnitude balancing
by normalizing all task gradients to the magnitude of the maximum
gradient norm. Extensive experiments conducted on several benchmark datasets
consistently demonstrate the state-of-the-art performance of DB-MTL.
Comment: Technical Report
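
To make the two balancing steps concrete, below is a minimal PyTorch sketch of one update on the shared parameters: each task loss is log-transformed before differentiation, and every task gradient is rescaled to the largest per-task gradient norm before the gradients are combined. The function name `db_mtl_step`, the `eps` constant, and the choice to average the rescaled gradients are illustrative assumptions, not the authors' reference implementation.

```python
import torch


def db_mtl_step(shared_params, task_losses, eps=1e-8):
    """One dual-balanced gradient update on the shared parameters (sketch).

    shared_params: list of shared-parameter tensors (requires_grad=True).
    task_losses:   list of scalar task losses from the same forward pass.
    """
    per_task_grads = []
    for loss in task_losses:
        # Loss-scale balancing: take the logarithm of each task loss so the
        # resulting gradient no longer depends on the raw loss scale.
        log_loss = torch.log(loss + eps)
        grads = torch.autograd.grad(log_loss, shared_params, retain_graph=True)
        per_task_grads.append(torch.cat([g.flatten() for g in grads]))

    # Gradient-magnitude balancing: rescale every task gradient to the
    # maximum gradient norm among tasks, then combine (averaging here is
    # an assumption about the combination step).
    norms = torch.stack([g.norm() for g in per_task_grads])
    max_norm = norms.max()
    balanced = [g * (max_norm / (n + eps)) for g, n in zip(per_task_grads, norms)]
    combined = torch.stack(balanced).mean(dim=0)

    # Write the combined gradient back into .grad so a standard optimizer
    # step can be applied afterwards.
    offset = 0
    for p in shared_params:
        numel = p.numel()
        p.grad = combined[offset:offset + numel].view_as(p).clone()
        offset += numel
```

In a training loop, this function would be called in place of the usual `loss.backward()` on a summed loss, followed by `optimizer.step()` on the shared backbone; task-specific heads can still be updated with their own unmodified losses.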