Paper Title

ATP: A holistic attention integrated approach to enhance ABSA

Paper Authors

Ashish Kumar, Vasundhra Dahiya, Aditi Sharan

Paper Abstract

Aspect-based sentiment analysis (ABSA) deals with identifying the sentiment polarity of a review sentence towards a given aspect. Deep learning sequential models such as RNN, LSTM, and GRU are the current state-of-the-art methods for inferring sentiment polarity. These methods capture the contextual relationships between the words of a review sentence well, but they struggle to capture long-term dependencies. The attention mechanism plays a significant role by focusing only on the most crucial parts of the sentence. In the case of ABSA, the aspect's position plays a vital role: words near the aspect contribute more when determining the sentiment towards it. Therefore, we propose a method that captures position-based information using the dependency parse tree and feeds it into the attention mechanism. Using this type of position information instead of a simple word-distance-based position enhances the deep learning model's performance. We performed experiments on the SemEval'14 dataset to demonstrate the effect of dependency-parse-relation-based attention for ABSA.
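
To make the position-weighting idea concrete, below is a minimal sketch of how distances over a dependency parse tree can be turned into per-token position weights for an attention layer. It assumes spaCy (en_core_web_sm) for parsing and networkx for tree distances; the linear weighting formula and the surface-string aspect matching are illustrative choices, not the paper's exact formulation.

```python
# Sketch: dependency-tree-distance position weights for ABSA attention.
# Assumes spaCy ("en_core_web_sm") for parsing and networkx for graph
# distances; the weighting formula is illustrative, not the paper's own.
import networkx as nx
import spacy

nlp = spacy.load("en_core_web_sm")

def dependency_position_weights(sentence: str, aspect: str):
    """Return one weight per token: the closer a token is to the aspect
    in the dependency tree, the larger its weight (in [0, 1])."""
    doc = nlp(sentence)

    # Build an undirected graph over token indices from dependency arcs.
    graph = nx.Graph()
    for token in doc:
        graph.add_node(token.i)
        if token.head.i != token.i:          # skip the root's self-reference
            graph.add_edge(token.i, token.head.i)

    # Indices of the aspect tokens (simple surface match, for illustration).
    aspect_ids = [t.i for t in doc if t.text.lower() in aspect.lower().split()]

    n = len(doc)
    weights = []
    for token in doc:
        # Tree distance to the nearest aspect token.
        d = min(nx.shortest_path_length(graph, token.i, a) for a in aspect_ids)
        weights.append(1.0 - d / n)           # nearer tokens get higher weight
    return [t.text for t in doc], weights

tokens, weights = dependency_position_weights(
    "The food was great but the service was slow", aspect="service")
for tok, w in zip(tokens, weights):
    print(f"{tok:10s} {w:.2f}")
```

In a full model, such weights would typically scale the attention scores or the encoder's hidden states so that words syntactically close to the aspect dominate the sentiment decision, in contrast to weighting by plain word distance in the surface string.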
