Paper Title


Certified Robustness of Quantum Classifiers against Adversarial Examples through Quantum Noise

Paper Authors

Jhih-Cing Huang, Yu-Lin Tsai, Chao-Han Huck Yang, Cheng-Fang Su, Chia-Mu Yu, Pin-Yu Chen, Sy-Yen Kuo

Paper Abstract


Recently, quantum classifiers have been found to be vulnerable to adversarial attacks, in which quantum classifiers are deceived by imperceptible perturbations, leading to misclassification. In this paper, we present the first theoretical study demonstrating that adding quantum random rotation noise can improve the robustness of quantum classifiers against adversarial attacks. We draw a connection to the definition of differential privacy and show that a quantum classifier trained with the naturally present additive noise is differentially private. Finally, we derive a certified robustness bound that enables quantum classifiers to defend against adversarial examples, supported by experimental results simulated with noise from IBM's 7-qubit device.
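To give a feel for the noise-injection-to-certification pipeline the abstract describes, here is a minimal classical sketch, not the paper's actual construction: a toy single-qubit classifier whose input angle is perturbed by Gaussian random rotation noise, smoothed by majority vote, and certified with a Cohen-et-al.-style randomized-smoothing radius. All function names, the noise level `sigma`, and the decision rule are illustrative assumptions, and the vote probability is clamped to avoid a degenerate radius when every sample agrees.

```python
import numpy as np
from statistics import NormalDist  # stdlib inverse normal CDF

def classify(theta):
    # Toy single-qubit "classifier": prepare RY(theta)|0> and predict
    # class 0 if the probability of measuring |0> exceeds 1/2.
    p0 = np.cos(theta / 2.0) ** 2
    return 0 if p0 > 0.5 else 1

def smoothed_classify(theta, sigma=0.3, n=2000, seed=0):
    # Inject random rotation noise (Gaussian angle perturbations) and
    # take a majority vote -- the randomized-smoothing analogue of
    # classifying under naturally present quantum noise.
    rng = np.random.default_rng(seed)
    votes = [classify(theta + sigma * rng.standard_normal())
             for _ in range(n)]
    count0 = votes.count(0)
    top = 0 if count0 >= n - count0 else 1
    p_top = max(count0, n - count0) / n
    # Clamp away from 1.0 so inv_cdf stays finite (a crude stand-in
    # for a proper lower confidence bound on the vote probability).
    p_top = min(p_top, 1.0 - 1.0 / n)
    # Certified radius in angle space: the smoothed prediction is
    # provably stable for perturbations |delta| < radius.
    radius = sigma * NormalDist().inv_cdf(p_top)
    return top, radius

label, radius = smoothed_classify(0.4)  # input well inside class 0
print(label, radius)
```

The certified radius grows with both the noise level `sigma` and the vote margin, which mirrors the abstract's claim that added noise, rather than only degrading accuracy, can be traded for a provable robustness guarantee.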
