Paper Title
A Survey of Multi-task Learning in Natural Language Processing: Regarding Task Relatedness and Training Methods
Paper Authors
Paper Abstract
Multi-task learning (MTL) has become increasingly popular in natural language processing (NLP) because it improves the performance of related tasks by exploiting their commonalities and differences. Nevertheless, it is still not well understood how multi-task learning should be implemented according to the relatedness of the training tasks. In this survey, we review recent advances in multi-task learning methods in NLP, aiming to summarize them, based on task relatedness, into two general multi-task training methods: (i) joint training and (ii) multi-step training. We present examples from various NLP downstream applications, summarize the task relationships, and discuss future directions for this promising topic.
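As a toy illustration of the joint-training regime the abstract mentions (this sketch is not from the surveyed paper; the model, variable names such as `W`, `h_a`, `h_b`, and the toy regression data are all hypothetical), two related tasks can share an encoder while each keeps its own head, with one gradient step minimizing the sum of both task losses:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data for two related linear regression tasks (purely illustrative).
X = rng.normal(size=(64, 8))
y_a = X @ rng.normal(size=8)        # task A targets
y_b = X @ rng.normal(size=8)        # task B targets

# Shared encoder (a single linear map) plus one linear head per task.
W = rng.normal(size=(8, 4)) * 0.1   # shared parameters
h_a = rng.normal(size=4) * 0.1      # task-A head
h_b = rng.normal(size=4) * 0.1      # task-B head

def mse(pred, y):
    return float(np.mean((pred - y) ** 2))

init_a = mse(X @ W @ h_a, y_a)
init_b = mse(X @ W @ h_b, y_b)

# (i) Joint training: each step descends on the *sum* of both task
# losses, so the shared encoder W receives gradients from both tasks.
lr = 0.05
for _ in range(500):
    Z = X @ W                        # shared representation
    err_a = Z @ h_a - y_a
    err_b = Z @ h_b - y_b
    # Gradients of the mean-squared-error losses (constant factors
    # absorbed into the learning rate).
    g_W = (np.outer(X.T @ err_a, h_a) + np.outer(X.T @ err_b, h_b)) / len(X)
    W -= lr * g_W
    h_a -= lr * (Z.T @ err_a) / len(X)
    h_b -= lr * (Z.T @ err_b) / len(X)

loss_a = mse(X @ W @ h_a, y_a)
loss_b = mse(X @ W @ h_b, y_b)
```

Multi-step training, by contrast, would apply these updates task by task, e.g. first training `W` and `h_a` on task A alone and then training `h_b` (and optionally fine-tuning `W`) on task B.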