Paper Title

Neural Inheritance Relation Guided One-Shot Layer Assignment Search

Paper Authors

Rang Meng, Weijie Chen, Di Xie, Yuan Zhang, Shiliang Pu

Paper Abstract

Layer assignment is seldom treated as an independent research topic in neural architecture search. In this paper, for the first time, we systematically investigate the impact of different layer assignments on network performance by building an architecture dataset of layer assignments on CIFAR-100. Through analyzing this dataset, we discover a neural inheritance relation among networks with different layer assignments: the optimal layer assignments for deeper networks always inherit from those for shallower networks. Inspired by this neural inheritance relation, we propose an efficient one-shot layer assignment search approach via inherited sampling. Specifically, the optimal layer assignment searched in a shallow network serves as a strong sampling prior for training and searching the deeper ones in the supernet, which dramatically reduces the search space. Comprehensive experiments on CIFAR-100 demonstrate the efficiency of the proposed method: our search results are strongly consistent with the optimal ones selected directly from the architecture dataset. To further confirm the generalization of our method, we also conduct experiments on Tiny-ImageNet and ImageNet, where the searched networks are markedly superior to handcrafted ones under unchanged computational budgets. The neural inheritance relation discovered in this paper can provide insights into universal neural architecture search.
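The abstract describes inherited sampling only at a high level. The sketch below illustrates the greedy search structure it implies: the best assignment at each total depth is assumed to extend the best assignment one layer shallower, so only one candidate per stage needs evaluating at each step instead of the full combinatorial space. This is a minimal illustration, not the authors' released implementation; the `evaluate` callback is a hypothetical stand-in for scoring a candidate sub-network (e.g., validation accuracy inside a trained one-shot supernet).

```python
from typing import Callable, Tuple

def inherited_layer_search(
    num_stages: int,
    target_depth: int,
    evaluate: Callable[[Tuple[int, ...]], float],
) -> Tuple[int, ...]:
    """Greedy layer-assignment search guided by the neural inheritance
    relation: the optimal assignment at depth d+1 is assumed to inherit
    from the optimal assignment at depth d by adding one layer to some
    stage. `evaluate` is a hypothetical callback returning the fitness
    of a candidate assignment (e.g., supernet validation accuracy).
    """
    assert target_depth >= num_stages, "need at least one layer per stage"
    # Start from the shallowest network: one layer per stage.
    best = tuple(1 for _ in range(num_stages))
    while sum(best) < target_depth:
        # Only num_stages candidates per depth step, each extending the
        # current best assignment by one layer in a single stage.
        candidates = [
            tuple(d + 1 if i == s else d for i, d in enumerate(best))
            for s in range(num_stages)
        ]
        best = max(candidates, key=evaluate)
    return best

# Toy usage with a synthetic scoring function standing in for the
# supernet evaluation; real usage would train a supernet first.
score = lambda a: -sum((d - i - 1) ** 2 for i, d in enumerate(a))
print(inherited_layer_search(num_stages=3, target_depth=12, evaluate=score))
```

Under this inheritance assumption, the number of evaluations grows linearly with the depth budget (one round of `num_stages` candidates per added layer), which is the source of the efficiency claimed in the abstract.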
