Paper Title

Almost-Orthogonal Layers for Efficient General-Purpose Lipschitz Networks

Paper Authors

Bernd Prach, Christoph H. Lampert

Paper Abstract

It is a highly desirable property for deep networks to be robust against small input changes. One popular way to achieve this property is by designing networks with a small Lipschitz constant. In this work, we propose a new technique for constructing such Lipschitz networks that has a number of desirable properties: it can be applied to any linear network layer (fully-connected or convolutional), it provides formal guarantees on the Lipschitz constant, it is easy to implement and efficient to run, and it can be combined with any training objective and optimization method. In fact, our technique is the first one in the literature that achieves all of these properties simultaneously. Our main contribution is a rescaling-based weight matrix parametrization that guarantees that each network layer has a Lipschitz constant of at most 1 and results in learned weight matrices that are close to orthogonal. Hence we call such layers almost-orthogonal Lipschitz (AOL). Experiments and ablation studies in the context of image classification with certified robust accuracy confirm that AOL layers achieve results that are on par with most existing methods. Yet, they are simpler to implement and more broadly applicable, because they do not require computationally expensive matrix orthogonalization or inversion steps as part of the network architecture. We provide code at https://github.com/berndprach/AOL.
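To make the "rescaling-based weight matrix parametrization" from the abstract concrete, below is a minimal PyTorch sketch (not the authors' official code; see the linked repository for that). It rescales a raw parameter matrix P by a diagonal factor d with d_j = (Σ_k |PᵀP|_{jk})^{-1/2}, which bounds the spectral norm of the resulting weight matrix W = P·diag(d) by 1, so the corresponding linear layer is 1-Lipschitz. The function name `aol_rescale` and the dense-layer framing are our own illustration.

```python
import torch

def aol_rescale(P: torch.Tensor) -> torch.Tensor:
    """Rescale parameter matrix P (out_features x in_features) so the
    returned weight matrix has spectral norm at most 1 (sketch of the
    AOL rescaling described in the abstract)."""
    PtP = P.t() @ P                                   # (in, in), symmetric
    # d_j = (sum_k |P^T P|_jk)^(-1/2); the clamp guards against
    # division by zero for all-zero columns of P.
    d = PtP.abs().sum(dim=1).clamp(min=1e-12).rsqrt()
    # W = P @ diag(d): scale column j of P by d_j.
    return P * d

# Usage: the rescaled weights always satisfy ||W||_2 <= 1,
# so the map x -> x @ W.t() is 1-Lipschitz.
P = torch.randn(32, 64)
W = aol_rescale(P)
print(torch.linalg.matrix_norm(W, ord=2))  # <= 1.0
```

Because the rescaling is just an inexpensive, differentiable function of P, it can be recomputed in every forward pass and trained through with any optimizer, which is what allows the method to avoid the explicit orthogonalization or matrix inversion steps the abstract contrasts it with.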
