Task Compass: Scaling Multi-task Pre-training with Task Prefix
- Zhuosheng Zhang,
- Shuohang Wang,
- Yichong Xu,
- Yuwei Fang,
- Wenhao Yu,
- Yang Liu,
- Hai Zhao,
- Chenguang Zhu,
- Michael Zeng
EMNLP 2022
Leveraging task-aware annotated data as supervised signals to assist with self-supervised learning on large-scale unlabeled data has become a new trend in pre-training language models. Existing studies show that multi-task learning with large-scale supervised tasks suffers from negative effects across tasks. To tackle the challenge, we propose a task prefix guided multi-task pre-training framework to explore the relationships among tasks. We conduct extensive experiments on 40 datasets, which show that our model can not only serve as a strong foundation backbone for a wide range of tasks but also be feasible as a probing tool for analyzing task relationships. The task relationships reflected by the prefixes align with transfer learning performance between tasks. They also suggest directions for data augmentation with complementary tasks, which help our model achieve human-parity results on commonsense reasoning leaderboards. Code is available at https://github.com/cooelf/CompassMTL.
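To give a rough sense of the task prefix idea, the sketch below shows how a task-identifying prefix could be prepended to each example so that a single shared model is trained on a mixture of supervised tasks. This is only an illustrative sketch, not the released CompassMTL implementation; the prefix format, the tokenizer checkpoint, and the task names are assumptions.

```python
# Illustrative sketch of task-prefix-guided multi-task inputs (assumed format,
# not the paper's exact implementation).
from transformers import AutoTokenizer

# Backbone checkpoint is an assumption for illustration purposes.
tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-v3-large")

def build_example(task_name: str, text: str, max_length: int = 256):
    """Prepend a task prefix marker so the model can condition on the task identity."""
    prefixed = f"[{task_name}] {text}"
    return tokenizer(prefixed, truncation=True, max_length=max_length)

# A mixed multi-task batch: each example carries its own task prefix,
# which lets the shared model learn (and later reveal) task relationships.
batch = [
    build_example("anli", "A soccer game with multiple males playing."),
    build_example("winogrande", "The trophy doesn't fit in the suitcase because it is too big."),
]
```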