Paper Title
Incorporating Dynamic Semantics into Pre-Trained Language Model for Aspect-based Sentiment Analysis
Paper Authors
Abstract
Aspect-based sentiment analysis (ABSA) predicts the sentiment polarity towards a specific aspect in a given sentence. While pre-trained language models such as BERT have achieved great success, incorporating dynamic semantic changes into ABSA remains challenging. To this end, in this paper, we propose to address this problem with Dynamic Re-weighting BERT (DR-BERT), a novel method designed to learn dynamic aspect-oriented semantics for ABSA. Specifically, we first take the Stack-BERT layers as a primary encoder to grasp the overall semantics of the sentence, and then fine-tune it by incorporating a lightweight Dynamic Re-weighting Adapter (DRA). Note that the DRA can pay close attention to a small region of the sentence at each step and re-weight the vitally important words for better aspect-aware sentiment understanding. Finally, experimental results on three benchmark datasets demonstrate the effectiveness and rationality of our proposed model and provide good interpretable insights for future semantic modeling.
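The core idea of aspect-oriented re-weighting can be illustrated with a minimal sketch. This is not the paper's DRA (whose internals are not specified in the abstract); it is a simple dot-product attention toy, assuming contextual token embeddings from some encoder and an embedding of the aspect term, that shows how tokens can be re-weighted by their relevance to a given aspect:

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def reweight_tokens(token_vecs, aspect_vec, temperature=1.0):
    """Score each token against the aspect and re-weight the token vectors.

    token_vecs: (seq_len, dim) contextual embeddings (e.g., from a BERT-style encoder)
    aspect_vec: (dim,) embedding of the aspect term
    Returns (weights, reweighted): weights sum to 1; reweighted has shape (seq_len, dim).
    """
    scores = token_vecs @ aspect_vec / temperature   # relevance of each token to the aspect
    weights = softmax(scores)                        # normalized importance weights
    reweighted = weights[:, None] * token_vecs       # scale each token vector by its weight
    return weights, reweighted

# toy example with random embeddings: 4 tokens, dim 3 (hypothetical values)
rng = np.random.default_rng(0)
tokens = rng.normal(size=(4, 3))
aspect = tokens[2]  # pretend token 2 is the aspect term
w, out = reweight_tokens(tokens, aspect)
```

In an adapter setting such as the one the abstract describes, a module of this kind would sit on top of the frozen or lightly fine-tuned encoder layers, letting the weights shift dynamically as the aspect changes.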