Title
Can a powerful neural network be a teacher for a weaker neural network?
Authors
Abstract
Transfer learning is widely used to learn in one context and apply the result in another, i.e. to apply acquired knowledge and skills to new situations. But is it possible to transfer learning from a deep neural network to a weaker neural network? Is it possible to improve the performance of a weak neural network using the knowledge acquired by a more powerful one? In this work, during the training of a weak network, we add a loss function that minimizes the distance between the features previously learned by a strong neural network and the features that the weak network must learn. To demonstrate the effectiveness and robustness of our approach, we conducted a large number of experiments on three well-known datasets and showed that a weak neural network can improve its performance if its learning process is driven by a more powerful neural network.
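The feature-matching objective described in the abstract can be sketched as follows. This is a minimal illustration only: the abstract does not specify the distance measure or how the extra term is weighted, so the mean-squared distance and the `alpha` weighting hyperparameter below are assumptions.

```python
def feature_matching_loss(student_feats, teacher_feats):
    """Mean squared distance between the weak (student) network's features
    and the features previously learned by the strong (teacher) network.
    The distance choice is an assumption; the paper's abstract only says
    'minimizes the distance'."""
    return sum((s - t) ** 2 for s, t in zip(student_feats, teacher_feats)) / len(student_feats)

def total_loss(task_loss, student_feats, teacher_feats, alpha=0.5):
    """Combined training objective for the weak network: its own task loss
    plus a term pulling its features toward the frozen teacher's features.
    `alpha` is a hypothetical weighting factor, not from the paper."""
    return task_loss + alpha * feature_matching_loss(student_feats, teacher_feats)

# Toy check: identical features contribute zero matching loss.
print(feature_matching_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # 0.0
```

In a real training loop, `teacher_feats` would come from a forward pass of the stronger network with its weights frozen, so gradients flow only into the weak network.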