Paper Title

Compressing Facial Makeup Transfer Networks by Collaborative Distillation and Kernel Decomposition

Authors

Bianjiang Yang, Zi Hui, Haoji Hu, Xinyi Hu, Lu Yu

Abstract

Although facial makeup transfer networks achieve high-quality performance in generating perceptually pleasing makeup images, their applicability is still restricted by the massive computation and storage cost of the network architecture. We address this issue by compressing facial makeup transfer networks with collaborative distillation and kernel decomposition. The main idea of collaborative distillation is based on the finding that encoder-decoder pairs construct an exclusive collaborative relationship, which can be regarded as a new kind of knowledge for low-level vision tasks. For kernel decomposition, we apply depth-wise separation of convolutional kernels to build a lightweight Convolutional Neural Network (CNN) from the original network. Extensive experiments demonstrate the effectiveness of our compression method when applied to the state-of-the-art facial makeup transfer network, BeautyGAN.
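The kernel decomposition the abstract refers to replaces a standard convolution with a depthwise convolution followed by a pointwise (1x1) convolution, in the style of depthwise-separable convolutions. A minimal sketch of the resulting parameter savings (the layer shapes below are illustrative, not taken from BeautyGAN):

```python
# Parameter counts for a standard convolution vs. its depthwise-separable
# decomposition (illustrative shapes, not BeautyGAN's actual layers).

def conv_params(c_in, c_out, k):
    # Standard k x k convolution: one k x k x c_in kernel per output channel.
    return c_in * c_out * k * k

def separable_params(c_in, c_out, k):
    # Depthwise: one k x k kernel per input channel.
    # Pointwise: a 1 x 1 convolution that mixes channels.
    return c_in * k * k + c_in * c_out

c_in, c_out, k = 128, 128, 3
std = conv_params(c_in, c_out, k)       # 147456 parameters
sep = separable_params(c_in, c_out, k)  # 17536 parameters
print(std, sep, round(std / sep, 1))    # roughly an 8x reduction here
```

For a k x k kernel the reduction factor approaches k^2 as the channel counts grow, which is why this decomposition is a common building block for lightweight CNNs.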
