Paper Title
What is the State of Neural Network Pruning?
Authors
Abstract
Neural network pruning---the task of reducing the size of a network by removing parameters---has been the subject of a great deal of work in recent years. We provide a meta-analysis of the literature, including an overview of approaches to pruning and consistent findings in the literature. After aggregating results across 81 papers and pruning hundreds of models in controlled conditions, our clearest finding is that the community suffers from a lack of standardized benchmarks and metrics. This deficiency is substantial enough that it is hard to compare pruning techniques to one another or determine how much progress the field has made over the past three decades. To address this situation, we identify issues with current practices, suggest concrete remedies, and introduce ShrinkBench, an open-source framework to facilitate standardized evaluations of pruning methods. We use ShrinkBench to compare various pruning techniques and show that its comprehensive evaluation can prevent common pitfalls when comparing pruning methods.
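For readers unfamiliar with the task the abstract describes, here is a minimal sketch of unstructured magnitude pruning, one of the simplest baselines in the pruning literature: remove a given fraction of a layer's weights, keeping those with the largest magnitudes. The function name and signature are illustrative only and are not ShrinkBench's actual API.

```python
def magnitude_prune(weights, fraction):
    """Zero out the `fraction` smallest-magnitude weights of a layer.

    weights:  flat list of floats (a layer's parameters)
    fraction: share of weights to remove, in [0, 1]
    """
    k = int(fraction * len(weights))
    if k == 0:
        return list(weights)
    # k-th smallest magnitude serves as the pruning threshold
    threshold = sorted(abs(w) for w in weights)[k - 1]
    # zero every weight at or below the threshold, keep the rest
    return [0.0 if abs(w) <= threshold else w for w in weights]

w = [0.9, -0.05, 0.4, -0.7, 0.01, 0.3, -0.2, 0.08]
pruned = magnitude_prune(w, 0.5)
# zeros the four smallest-magnitude weights: -0.05, 0.01, -0.2, 0.08
```

Real pruning methods differ mainly in how they score weights (magnitude, gradient-based, learned), whether they prune individual weights or whole structures, and how they interleave pruning with retraining; standardized comparison of those choices is exactly what the paper argues is missing.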