Paper Title


SKEP: Sentiment Knowledge Enhanced Pre-training for Sentiment Analysis

Authors

Hao Tian, Can Gao, Xinyan Xiao, Hao Liu, Bolei He, Hua Wu, Haifeng Wang, Feng Wu

Abstract


Recently, sentiment analysis has seen remarkable advances with the help of pre-training approaches. However, sentiment knowledge, such as sentiment words and aspect-sentiment pairs, is ignored in the process of pre-training, despite the fact that it is widely used in traditional sentiment analysis approaches. In this paper, we introduce Sentiment Knowledge Enhanced Pre-training (SKEP) in order to learn a unified sentiment representation for multiple sentiment analysis tasks. With the help of automatically-mined knowledge, SKEP conducts sentiment masking and constructs three sentiment knowledge prediction objectives, so as to embed sentiment information at the word, polarity, and aspect levels into the pre-trained sentiment representation. In particular, the prediction of aspect-sentiment pairs is converted into multi-label classification, aiming to capture the dependency between the words in a pair. Experiments on three kinds of sentiment tasks show that SKEP significantly outperforms strong pre-training baselines, and achieves new state-of-the-art results on most of the test datasets. We release our code at https://github.com/baidu/Senta.
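The multi-label formulation mentioned in the abstract can be illustrated with a minimal sketch: instead of a softmax that picks one word per masked position, each vocabulary word is scored independently with a sigmoid, so both words of a masked aspect-sentiment pair can be predicted from a single representation. The function name, toy vocabulary, and dimensions below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def multi_label_pair_prediction(hidden, output_weights, threshold=0.5):
    """Score every vocabulary word independently for a masked pair.

    Standard masked language modeling applies a per-position softmax,
    which assumes exactly one correct word. The multi-label variant
    sketched here uses per-word sigmoids, so several words (e.g. an
    aspect word and its sentiment word) can be predicted together.

    hidden         : (hidden_size,) sentence/position representation
    output_weights : (vocab_size, hidden_size) output embedding matrix
    """
    logits = hidden @ output_weights.T          # (vocab_size,)
    probs = sigmoid(logits)                     # independent probabilities
    predicted = np.where(probs >= threshold)[0] # all words above threshold
    return probs, predicted

# Toy example: a 2-dim representation and a 3-word vocabulary where
# words 0 and 2 (say, an aspect and a sentiment word) fit the context.
hidden = np.array([1.0, 0.0])
output_weights = np.array([[3.0, 0.0],    # word 0: strong match
                           [-3.0, 0.0],   # word 1: poor match
                           [2.5, 0.0]])   # word 2: strong match
probs, predicted = multi_label_pair_prediction(hidden, output_weights)
# predicted contains both word 0 and word 2, which a softmax could not do
```

Training such a head would typically use a binary cross-entropy loss summed over the vocabulary, one term per word, rather than the categorical cross-entropy used for ordinary masked-word prediction.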
