Paper Title

QSAN: A Near-term Achievable Quantum Self-Attention Network

Paper Authors

Jinjing Shi, Ren-Xin Zhao, Wenxuan Wang, Shichao Zhang, Xuelong Li

Paper Abstract

The Self-Attention Mechanism (SAM) is good at capturing the internal connections of features and greatly improves the performance of machine learning models, especially those requiring efficient characterization and feature extraction of high-dimensional data. A novel Quantum Self-Attention Network (QSAN) is proposed for image classification tasks on near-term quantum devices. First, a Quantum Self-Attention Mechanism (QSAM), including Quantum Logic Similarity (QLS) and the Quantum Bit Self-Attention Score Matrix (QBSASM), is explored as the theoretical basis of QSAN to enhance the data representation of SAM. QLS is employed so that inner products need not be obtained through measurements, allowing QSAN to be implemented fully on quantum computers, while QBSASM, a density matrix produced by the evolution of QSAN, effectively reflects the attention distribution of the output. Then, the one-step realization framework and quantum circuits of QSAN are designed with full consideration of compressing the number of measurements needed to acquire QBSASM in the intermediate process; a quantum coordinate prototype is also introduced into the quantum circuit to describe the mathematical relationship between the output and the control bits and thus facilitate programming. Finally, method comparisons and binary classification experiments on MNIST with the PennyLane platform demonstrate that, with similar parameter configurations and 100% prediction accuracy, QSAN converges about 1.7x and 2.3x faster than the hardware-efficient ansatz and the QAOA ansatz respectively, indicating a better learning capability. QSAN is well suited to fast and in-depth analysis of the primary and secondary relationships in images and other data, and has great potential for quantum computer vision applications from the perspective of enhancing the information extraction ability of models.
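For orientation, the sketch below (not from the paper) shows the classical self-attention computation that QSAN quantizes: the inner-product similarity here is what QLS replaces, and the softmax score matrix is the classical counterpart of QBSASM. All names and array shapes are illustrative assumptions.

```python
import numpy as np

def classical_self_attention(X, Wq, Wk, Wv):
    """Classical SAM for reference: the inner-product similarity Q @ K.T is
    what QLS replaces, and the softmax score matrix is the classical
    counterpart of QBSASM."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv                     # queries, keys, values
    scores = Q @ K.T / np.sqrt(K.shape[-1])              # inner-product similarity
    scores = np.exp(scores - scores.max(axis=-1, keepdims=True))
    attn = scores / scores.sum(axis=-1, keepdims=True)   # attention score matrix
    return attn @ V                                      # attention-weighted output

# Illustrative shapes only: 4 "tokens" with 8-dimensional features.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
output = classical_self_attention(X, Wq, Wk, Wv)
```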

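The abstract benchmarks QSAN against hardware-efficient and QAOA ansatz baselines trained with PennyLane for binary classification on MNIST. Below is a minimal sketch of what such a baseline classifier could look like; the qubit count, embedding, ansatz template (StronglyEntanglingLayers as a generic hardware-efficient-style circuit), loss, and toy data are all assumptions rather than the paper's configuration.

```python
import pennylane as qml
from pennylane import numpy as np

# Assumed sizes for illustration only; the paper's actual circuit widths,
# depths, and loss function are not specified in the abstract.
n_qubits, n_layers = 4, 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def baseline_classifier(x, weights):
    """Hardware-efficient-style baseline: angle embedding of (down-sampled)
    pixel features, entangling layers, readout on qubit 0."""
    qml.AngleEmbedding(x, wires=range(n_qubits))
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return qml.expval(qml.PauliZ(0))                     # sign encodes the binary label

weights = np.random.uniform(0, np.pi, size=(n_layers, n_qubits, 3), requires_grad=True)

def cost(weights, X, y):
    # Mean squared error between circuit outputs and +/-1 labels.
    return sum((baseline_classifier(x, weights) - t) ** 2 for x, t in zip(X, y)) / len(X)

# Toy data standing in for down-sampled MNIST digits of two classes.
X_batch = np.random.uniform(0, np.pi, size=(8, n_qubits), requires_grad=False)
y_batch = np.array([1.0, -1.0] * 4, requires_grad=False)

opt = qml.GradientDescentOptimizer(stepsize=0.1)
for _ in range(5):                                       # short illustrative training loop
    weights = opt.step(lambda w: cost(w, X_batch, y_batch), weights)
```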