Paper Title

Two-Level Residual Distillation based Triple Network for Incremental Object Detection

Paper Authors

Dongbao Yang, Yu Zhou, Dayan Wu, Can Ma, Fei Yang, Weiping Wang

Paper Abstract

Modern object detection methods based on convolutional neural networks suffer from severe catastrophic forgetting when learning new classes without the original data. Because of the time cost, storage burden, and privacy concerns associated with old data, it is inadvisable to retrain the model from scratch on both old and new data whenever new object classes emerge after the model has been trained. In this paper, we propose a novel incremental object detector based on Faster R-CNN that continuously learns new object classes without using old data. It is a triple network in which an old model and a residual model act as assistants, helping the incremental model learn new classes without forgetting previously learned knowledge. To better maintain the discrimination between features of old and new classes, the residual model is jointly trained on the new classes during the incremental learning procedure. In addition, a corresponding distillation scheme is designed to guide the training process; it consists of a two-level residual distillation loss and a joint classification distillation loss. Extensive experiments on VOC2007 and COCO demonstrate that the proposed method can effectively learn to incrementally detect objects of new classes, and that catastrophic forgetting is mitigated in this setting.
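
The abstract names the two distillation terms but gives no formulas. Below is a minimal, hypothetical PyTorch sketch of what such terms could look like, assuming MSE-style matching for the two-level residual distillation (backbone features and RPN outputs) and temperature-scaled KL for the classification distillation. All function names, tensor shapes, and the MSE/KL choices are illustrative assumptions, not taken from the paper.

import torch
import torch.nn.functional as F

def two_level_residual_distillation(feat_inc, feat_old, feat_res,
                                    rpn_inc, rpn_old, rpn_res):
    # Pull the incremental model's outputs toward the sum of the frozen
    # old model and the jointly trained residual model, at two levels:
    # backbone features and RPN outputs. Old-model terms are detached
    # because that network is frozen. (Hypothetical sketch.)
    feat_loss = F.mse_loss(feat_inc, feat_old.detach() + feat_res)
    rpn_loss = F.mse_loss(rpn_inc, rpn_old.detach() + rpn_res)
    return feat_loss + rpn_loss

def joint_classification_distillation(logits_inc, logits_old, T=2.0):
    # Keep the incremental model's posteriors over the old classes close
    # to the frozen old model's soft predictions, via standard
    # temperature-scaled KL distillation. (Hypothetical sketch.)
    soft_old = F.softmax(logits_old.detach() / T, dim=-1)
    log_soft_inc = F.log_softmax(logits_inc / T, dim=-1)
    return F.kl_div(log_soft_inc, soft_old, reduction="batchmean") * (T * T)

# Toy usage with random tensors standing in for real network outputs.
feat_inc, feat_old, feat_res = (torch.randn(2, 256, 14, 14) for _ in range(3))
rpn_inc, rpn_old, rpn_res = (torch.randn(2, 9, 14, 14) for _ in range(3))
logits_inc, logits_old = (torch.randn(8, 21) for _ in range(2))

loss = (two_level_residual_distillation(feat_inc, feat_old, feat_res,
                                        rpn_inc, rpn_old, rpn_res)
        + joint_classification_distillation(logits_inc, logits_old))
print(loss.item())

In training, terms like these would be added to the standard Faster R-CNN detection loss on the new classes, with the old model frozen and the residual model updated jointly, as the abstract indicates.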
