Journal of Information Security Research ›› 2022, Vol. 8 ›› Issue (8): 812-.
周梓馨,张功萱,寇小勇,杨威
Abstract: Because deep learning can freely extract and combine features, an increasing number of researchers are using it to mount side-channel attacks without preprocessing steps such as selecting points of interest and alignment. Existing deep-learning-based side-channel attack models are built with multilayer perceptron networks, convolutional neural networks, and recurrent neural networks, but they suffer from several problems during training, such as rapid overfitting, vanishing gradients, and slow convergence. Meanwhile, the self-attention mechanism has shown strong feature-extraction ability in natural language processing, computer vision, and other domains. Based on the characteristics of deep-learning-based side-channel attacks, we present SADLSCA, a deep learning side-channel attack model built on the self-attention mechanism, making self-attention applicable to this field. By exploiting the ability of self-attention to extract points of interest from a global perspective, SADLSCA addresses the rapid overfitting, vanishing gradients, and slow convergence that deep-learning-based side-channel attack models exhibit during training, and experiments verify that the number of power traces required for a successful attack on the public datasets ASCAD and CHES CTF 2018 is reduced by 23.1% and 41.7%, respectively.
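As an illustration only (the paper's actual SADLSCA architecture is not reproduced on this page), the sketch below shows one plausible way to place a self-attention layer inside a profiled deep-learning side-channel attack model in PyTorch. The layer layout, the 700-sample trace length (typical of ASCAD), and the 256-class output over a targeted S-box byte are assumptions made for the example.

```python
# Minimal sketch (assumptions: ASCAD-style traces of 700 samples, 256-way
# classification of an intermediate S-box byte). Illustrative only; this is
# not the authors' SADLSCA implementation.
import torch
import torch.nn as nn

class SelfAttention1D(nn.Module):
    """Scaled dot-product self-attention over the time axis of a trace."""
    def __init__(self, dim):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        self.scale = dim ** -0.5

    def forward(self, x):                      # x: (batch, time, dim)
        q, k, v = self.q(x), self.k(x), self.v(x)
        attn = torch.softmax(q @ k.transpose(1, 2) * self.scale, dim=-1)
        return attn @ v                        # global view over all samples

class AttentionSCA(nn.Module):
    """Conv front-end + self-attention + classifier over 256 key-byte hypotheses."""
    def __init__(self, trace_len=700, channels=64, n_classes=256):
        super().__init__()
        self.embed = nn.Sequential(
            nn.Conv1d(1, channels, kernel_size=11, padding=5),
            nn.ReLU(),
            nn.AvgPool1d(2),
        )
        self.attn = SelfAttention1D(channels)
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(channels * (trace_len // 2), n_classes),
        )

    def forward(self, traces):                 # traces: (batch, trace_len)
        x = self.embed(traces.unsqueeze(1))    # (batch, channels, time)
        x = self.attn(x.transpose(1, 2))       # attend across all time samples
        return self.head(x)                    # logits over 256 hypotheses

if __name__ == "__main__":
    model = AttentionSCA()
    logits = model(torch.randn(8, 700))        # 8 random traces as a smoke test
    print(logits.shape)                        # torch.Size([8, 256])
```

The key design point suggested by the abstract is that attention weights are computed between every pair of trace samples, so informative points of interest can be picked out globally rather than through fixed-size convolutional receptive fields.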
Key words: deep learning, side-channel attack, self-attention mechanism, neural network, modeling attack
周梓馨, 张功萱, 寇小勇, 杨威. A Deep Learning Side-Channel Attack Method Based on the Self-Attention Mechanism[J]. Journal of Information Security Research, 2022, 8(8): 812-.
URL: http://www.sicris.cn/EN/Y2022/V8/I8/812