Multi-task learning (MTL), which aims to solve multiple tasks simultaneously
with a single model, has been successfully applied in many real-world
applications. The general idea of MTL is to design a global parameter-sharing
mechanism together with task-specific feature extractors to improve the
performance of all tasks. However, balancing the trade-off among tasks remains
challenging, since model performance is sensitive to the relationships between
tasks. Weakly correlated or even conflicting tasks degrade performance by
introducing unhelpful or negative information.
Therefore, it is important to efficiently exploit and learn fine-grained
feature representations corresponding to each task. In this paper, we propose
an Adaptive Pattern Extraction Multi-task (APEM) framework, which is adaptive
and flexible for large-scale industrial applications. APEM fully utilizes the
feature information by learning the interactions between input feature fields
and extracting the corresponding task-specific information. We first
introduce a DeepAuto Group Transformer module to automatically and efficiently
enhance the feature expressivity with a modified set attention mechanism and a
Squeeze-and-Excitation operation. Second, an explicit Pattern Selector is
introduced to further enable selective feature representation learning via
adaptive task-indicator vectors. Empirical evaluations show that APEM
outperforms the state-of-the-art MTL methods on public and real-world financial
services datasets. More importantly, we explore the online performance of APEM
in a real industrial-level recommendation scenario.

Comment: 18 pages, 9 figures