Paper Title
An Enhanced Convolutional Neural Network in Side-Channel Attacks and Its Visualization
Paper Authors
Paper Abstract
In recent years, convolutional neural networks (CNNs) have attracted considerable interest in the side-channel community. Previous work has shown that CNNs have the potential to break cryptographic algorithms protected with masking or desynchronization. Several CNN models have already been exploited, reaching the same or an even better level of performance than traditional side-channel attacks (SCAs). In this paper, we investigate the architecture of the Residual Network and build a new CNN model called the attention network. To enhance the power of the attention network, we introduce an attention mechanism, the Convolutional Block Attention Module (CBAM), and incorporate it into the CNN architecture. CBAM highlights the informative points of the input traces and makes the attention network focus on the relevant leakages in the measurements. This improves the performance of the CNN, because irrelevant points introduce extra noise and degrade attack performance. We compare our attention network with the network designed for a masked AES implementation, the ASCAD network, and show that the attention network performs better. Finally, a new visualization method, named Class Gradient Visualization (CGV), is proposed to identify which points of the input traces have a positive influence on the predicted result of the neural network. From another perspective, CGV explains why the attention network is superior to the ASCAD network. We validate the attention network through extensive experiments on four public datasets and demonstrate that it is effective on different AES implementations.
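The abstract does not specify the paper's exact layer configuration, so the following is only a minimal sketch of how a CBAM-style block can be adapted to 1D side-channel traces in PyTorch. The class names, the reduction ratio (8), and the spatial kernel size (7) are illustrative assumptions, not the paper's reported hyperparameters; the structure (channel attention followed by spatial attention over trace points) follows the standard CBAM design by Woo et al.

```python
import torch
import torch.nn as nn

class ChannelAttention1d(nn.Module):
    """Channel attention: reweight each feature map by its global importance."""
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):                              # x: (batch, channels, length)
        avg = self.mlp(x.mean(dim=2))                  # average-pooled descriptor
        mx = self.mlp(x.amax(dim=2))                   # max-pooled descriptor
        scale = torch.sigmoid(avg + mx).unsqueeze(2)   # (batch, channels, 1)
        return x * scale

class SpatialAttention1d(nn.Module):
    """Spatial attention: reweight each time sample (trace point) by importance."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv1d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)              # (batch, 1, length)
        mx = x.amax(dim=1, keepdim=True)
        scale = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * scale                               # emphasizes leaky trace points

class CBAM1d(nn.Module):
    """Channel-then-spatial attention, inserted after a convolutional block."""
    def __init__(self, channels):
        super().__init__()
        self.ca = ChannelAttention1d(channels)
        self.sa = SpatialAttention1d()

    def forward(self, x):
        return self.sa(self.ca(x))
```

In this arrangement, the spatial attention map is what lets the network down-weight uninformative trace points, which is the mechanism the abstract credits for suppressing the extra noise introduced by irrelevant samples.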
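The precise definition of CGV is given in the paper body, not in this abstract. As a rough illustration of the general idea described here, the sketch below computes a generic saliency-style attribution: the gradient of the target-class score with respect to the input trace, keeping only positive components (points that push the prediction toward the target class). The function name and the positive-gradient choice are assumptions for illustration only.

```python
import torch

def class_gradient_map(model, trace, target_class):
    """Gradient of the target-class score w.r.t. the input trace.

    A generic saliency-style attribution in the spirit of the abstract's
    description; the paper's actual CGV method may differ in detail.
    trace is expected with shape (1, 1, length).
    """
    model.eval()
    x = trace.clone().detach().requires_grad_(True)
    score = model(x)[0, target_class]   # score/logit of the target class
    score.backward()
    # Keep only positive gradients: trace points with a positive
    # influence on the predicted class, per the abstract's wording.
    return x.grad.clamp(min=0).squeeze()
```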