Paper Title

Learning New Tasks from a Few Examples with Soft-Label Prototypes

Authors

Singh, Avyav Kumar, Shutova, Ekaterina, Yannakoudakis, Helen

Abstract

Existing approaches to few-shot learning in NLP rely on large language models (LLMs) and/or fine-tuning of these to generalise on out-of-distribution data. In this work, we propose a novel few-shot learning approach based on soft-label prototypes (SLPs) designed to collectively capture the distribution of different classes across the input domain space. We focus on learning previously unseen NLP tasks from very few examples (4, 8, 16) per class and experimentally demonstrate that our approach achieves superior performance on the majority of tested tasks in this data-lean setting while being highly parameter efficient. We also show that our few-shot adaptation method can be integrated into more generalised learning settings, primarily meta-learning, to yield superior performance against strong baselines.
