Paper Title

K-12BERT: BERT for K-12 Education

Paper Authors

Vasu Goel, Dhruv Sahnan, Venktesh V, Gaurav Sharma, Deep Dwivedi, Mukesh Mohania

Paper Abstract

Online education platforms are powered by various NLP pipelines that utilize models like BERT to aid in content curation. Since the inception of pre-trained language models like BERT, there have also been many efforts to adapt these pre-trained models to specific domains. However, to the best of our knowledge, no model has been adapted specifically for the education domain (particularly K-12) across subjects. In this work, we propose to train a language model for K-12 education on a corpus of data that we curated across multiple subjects from various sources. We also evaluate our model, K-12BERT, on downstream tasks like hierarchical taxonomy tagging.
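
The adaptation the abstract describes follows the standard continued-pretraining recipe: start from a general BERT checkpoint and continue masked-language-model training on the domain corpus. The paper itself does not include code, so the following is only a minimal sketch using the Hugging Face Transformers library; the corpus path `k12_corpus.txt` and all hyperparameters are illustrative assumptions, not the authors' actual setup.

```python
# Minimal sketch of domain-adaptive MLM pretraining on a K-12 text corpus.
# Assumes a plain-text file "k12_corpus.txt" (hypothetical path), one
# training example per line. Not the authors' actual configuration.
from transformers import (
    AutoTokenizer,
    AutoModelForMaskedLM,
    DataCollatorForLanguageModeling,
    LineByLineTextDataset,
    Trainer,
    TrainingArguments,
)

# Initialize from a general-domain BERT checkpoint.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# LineByLineTextDataset is deprecated in recent Transformers releases but
# still available; each line of the file becomes one example.
dataset = LineByLineTextDataset(
    tokenizer=tokenizer,
    file_path="k12_corpus.txt",
    block_size=128,
)

# Standard BERT-style masking: 15% of tokens are masked for prediction.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

args = TrainingArguments(
    output_dir="k12-bert",          # hypothetical output directory
    num_train_epochs=3,             # assumed, not from the paper
    per_device_train_batch_size=16, # assumed, not from the paper
)

Trainer(
    model=model,
    args=args,
    data_collator=collator,
    train_dataset=dataset,
).train()
```

Downstream evaluation such as the hierarchical taxonomy tagging mentioned in the abstract would then fine-tune the resulting checkpoint with a task-specific classification head.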
