Paper Title
Exploring Contextualized Neural Language Models for Temporal Dependency Parsing
Paper Authors
Paper Abstract
Extracting temporal relations between events and time expressions has many applications, such as constructing event timelines and time-related question answering. It is a challenging problem that requires syntactic and semantic information at the sentence or discourse level, which may be captured by deep contextualized language models (LMs) such as BERT (Devlin et al., 2019). In this paper, we develop several variants of a BERT-based temporal dependency parser, and show that BERT significantly improves temporal dependency parsing (Zhang and Xue, 2018a). We also present a detailed analysis of why deep contextualized neural LMs help and where they may fall short. Source code and resources are made available at https://github.com/bnmin/tdp_ranking.
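To make the abstract's idea concrete, the following is a minimal Python sketch of how contextualized BERT embeddings for an event mention and a time expression might be extracted and combined into a pair representation for a ranking-style parser. It assumes the HuggingFace transformers library and bert-base-uncased; the mean-pooling over wordpieces and the concatenation-based pair feature are illustrative assumptions, not the authors' actual architecture from the repository above.

# A minimal sketch (not the authors' implementation) of extracting
# contextualized BERT span embeddings for an event and a time expression.
# Assumes HuggingFace `transformers`; pooling strategy is an illustrative choice.
import torch
from transformers import BertTokenizerFast, BertModel

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

sentence = "The storm hit on Monday and cleanup continued for a week."
# Character offsets of two mentions: the event "hit" and the timex "Monday".
mentions = [(10, 13), (17, 23)]

enc = tokenizer(sentence, return_offsets_mapping=True, return_tensors="pt")
offsets = enc.pop("offset_mapping")[0]  # (num_tokens, 2) character spans

with torch.no_grad():
    hidden = model(**enc).last_hidden_state[0]  # (num_tokens, hidden_size)

def span_embedding(start, end):
    # Mean-pool hidden states of all wordpieces overlapping the character span;
    # special tokens have (0, 0) offsets and are filtered out by `e > s`.
    idx = [i for i, (s, e) in enumerate(offsets.tolist())
           if e > start and s < end and e > s]
    return hidden[idx].mean(dim=0)

event_vec, timex_vec = [span_embedding(s, e) for (s, e) in mentions]
# A parser could score candidate (child, parent) pairs from such vectors,
# e.g. with an MLP over their concatenation.
pair_features = torch.cat([event_vec, timex_vec])
print(pair_features.shape)  # torch.Size([1536]) for bert-base

In a ranking parser along the lines of Zhang and Xue (2018a), such pair features would be scored against all candidate parents for each child node, and the highest-scoring parent would be attached in the temporal dependency tree.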