Paper Title
Adaptive Sparse Structure Development with Pruning and Regeneration for Spiking Neural Networks
Paper Authors
Paper Abstract
Spiking Neural Networks (SNNs) are more biologically plausible and computationally efficient than conventional deep neural networks. SNNs therefore have a natural advantage in drawing on the sparse structural plasticity of brain development to alleviate the energy problems caused by the complex, fixed structures of deep neural networks. However, previous SNN compression works lack in-depth inspiration from the plasticity mechanisms of brain development. This paper proposes a novel method for the adaptive structural development of SNNs (SD-SNN), introducing dendritic-spine-plasticity-based synaptic constraint, neuronal pruning, and synaptic regeneration. We find that synaptic constraint and neuronal pruning can detect and remove a large amount of redundancy in SNNs and, coupled with synaptic regeneration, can effectively prevent and repair over-pruning. Moreover, inspired by the neurotrophic hypothesis, the neuronal pruning rate and synaptic regeneration rate are adaptively adjusted during the learning-while-pruning process, which eventually leads to structural stability of the SNN. Experimental results on spatial (MNIST, CIFAR-10) and temporal neuromorphic (N-MNIST, DVS-Gesture) datasets demonstrate that our method can flexibly learn an appropriate compression rate for each task and achieve superior performance while massively reducing network energy consumption. Specifically, on the spatial MNIST dataset, our SD-SNN achieves 99.51\% accuracy at a pruning rate of 49.83\%, a 0.05\% accuracy improvement over the uncompressed baseline. On the neuromorphic DVS-Gesture dataset, our method achieves 98.20\% accuracy, a 1.09\% improvement, at a compression rate of 55.50\%.
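The prune-and-regenerate loop described in the abstract can be illustrated with a minimal, hypothetical NumPy sketch. This is not the paper's actual algorithm: the real method uses dendritic-spine-plasticity-based criteria and neurotrophic-hypothesis-derived rate updates that are not specified in the abstract, so here neuron importance is approximated by L1 synaptic strength, regeneration picks pruned synapses at random, and both rates simply decay geometrically to mimic the eventual structural stability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy fully connected layer: rows = output neurons, cols = input synapses.
W = rng.normal(size=(8, 16))
mask = np.ones_like(W, dtype=bool)  # True = synapse present

def prune_neurons(W, mask, prune_rate):
    """Remove the weakest output neurons (all of their synapses),
    ranked by total L1 strength of their surviving synapses."""
    strength = np.abs(W * mask).sum(axis=1)
    n_prune = int(prune_rate * W.shape[0])
    weakest = np.argsort(strength)[:n_prune]
    mask = mask.copy()
    mask[weakest, :] = False
    return mask

def regenerate_synapses(mask, regen_rate, rng):
    """Re-grow a random fraction of previously pruned synapses,
    standing in for the paper's synaptic regeneration step."""
    pruned = np.argwhere(~mask)
    n_regen = int(regen_rate * len(pruned))
    if n_regen == 0:
        return mask
    picks = pruned[rng.choice(len(pruned), size=n_regen, replace=False)]
    mask = mask.copy()
    mask[picks[:, 0], picks[:, 1]] = True
    return mask

prune_rate, regen_rate = 0.25, 0.10  # illustrative starting rates
for epoch in range(5):
    mask = prune_neurons(W, mask, prune_rate)
    mask = regenerate_synapses(mask, regen_rate, rng)
    # Hypothetical adaptive schedule: both rates shrink as learning
    # proceeds, so the network structure stabilises over epochs.
    prune_rate *= 0.5
    regen_rate *= 0.5

sparsity = 1.0 - mask.mean()
print(f"final sparsity: {sparsity:.2f}")
```

In an actual learning-while-pruning setup, the surviving weights in `W * mask` would continue to be trained between pruning steps; the sketch only shows how pruning, regeneration, and decaying rates interact on the connectivity mask.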