Paper Title
Self-Attention for Incomplete Utterance Rewriting
Paper Authors
Paper Abstract
Incomplete utterance rewriting (IUR) has recently become an essential task in NLP, aiming to complete an incomplete utterance with sufficient context information for comprehension. In this paper, we propose a novel method that directly extracts the coreference and omission relationships from the self-attention weight matrix of the transformer, instead of from word embeddings, and edits the original text accordingly to generate the complete utterance. Benefiting from the rich information in the self-attention weight matrix, our method achieves competitive results on public IUR datasets.
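The abstract hinges on reading token-pair relations off the self-attention weight matrix rather than off word embeddings. A minimal sketch of that idea in NumPy is below; the thresholding rule and the 0.5 cutoff are hypothetical illustrations, not the paper's actual decision procedure.

```python
import numpy as np

def self_attention_weights(Q, K):
    """Standard scaled dot-product attention weights: softmax(QK^T / sqrt(d))."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    return weights / weights.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
n_tokens, d = 6, 8
Q = rng.normal(size=(n_tokens, d))  # toy query vectors for 6 tokens
K = rng.normal(size=(n_tokens, d))  # toy key vectors
A = self_attention_weights(Q, K)

# A[i, j] scores how strongly token i attends to token j; each row sums to 1.
# Hypothetical edit rule: treat strongly-attended token pairs as candidate
# coreference/omission links that an editor could act on.
links = [(i, j) for i in range(n_tokens) for j in range(n_tokens)
         if i != j and A[i, j] > 0.5]
```

In a real system the matrix `A` would come from a trained transformer layer (e.g. averaged over heads) rather than random toy vectors.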