Paper Title

Human Sentence Processing: Recurrence or Attention?

Paper Authors

Merkx, Danny; Frank, Stefan L.

Paper Abstract

Recurrent neural networks (RNNs) have long been an architecture of interest for computational models of human sentence processing. The recently introduced Transformer architecture outperforms RNNs on many natural language processing tasks, but little is known about its ability to model human language processing. We compare Transformer- and RNN-based language models' ability to account for measures of human reading effort. Our analysis shows Transformers to outperform RNNs in explaining self-paced reading times and neural activity during reading English sentences, challenging the widely held idea that human sentence processing involves recurrent and immediate processing, and providing evidence for cue-based retrieval.
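Comparisons of this kind are commonly operationalized through per-word surprisal, the negative log-probability a language model assigns to each word given its preceding context, which is then used as a predictor of reading effort. The sketch below is a minimal, hypothetical illustration of computing token-level surprisal with an off-the-shelf GPT-2 via the HuggingFace `transformers` library; the paper trained and evaluated its own Transformer and RNN models, so the model choice and sentence here are assumptions for illustration only.

```python
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

# NOTE: "gpt2" is a stand-in for illustration; the paper compared its own
# Transformer and RNN language models trained on matched data.
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def token_surprisals(sentence):
    """Surprisal (in bits) of each token given its preceding context."""
    ids = tokenizer(sentence, return_tensors="pt")["input_ids"]
    with torch.no_grad():
        logits = model(ids).logits             # (1, seq_len, vocab_size)
    # Log-probabilities over the next token at every position but the last.
    log_probs = torch.log_softmax(logits[0, :-1], dim=-1)
    targets = ids[0, 1:]                       # tokens actually observed
    nats = -log_probs[torch.arange(targets.size(0)), targets]
    bits = nats / torch.log(torch.tensor(2.0)) # convert nats to bits
    tokens = tokenizer.convert_ids_to_tokens(targets.tolist())
    return list(zip(tokens, bits.tolist()))

for tok, s in token_surprisals("The horse raced past the barn fell."):
    print(f"{tok!r:>12}  {s:6.2f} bits")
```

In an evaluation like the one the abstract describes, per-word surprisals from each architecture would then enter a regression against self-paced reading times or neural measures, and the architecture whose surprisal yields the better fit is judged the better model of human reading effort.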
