Paper Title
Attention-based Quantum Tomography
Paper Authors
Paper Abstract
With rapid progress across platforms for quantum systems, many-body quantum state reconstruction for noisy quantum states has become an important challenge. Recent works found promise in recasting quantum state reconstruction as learning the probability distribution of quantum state measurement vectors with generative neural network models. Here we propose "Attention-based Quantum Tomography" (AQT), a quantum state reconstruction method that uses an attention-based generative network to learn the mixed-state density matrix of a noisy quantum state. AQT builds on the model proposed in "Attention is all you need" by Vaswani et al. (2017), which was designed to learn long-range correlations in natural language sentences and thereby outperformed previous natural language processing models. We demonstrate not only that AQT outperforms earlier neural-network-based quantum state reconstruction on identical tasks, but also that AQT can accurately reconstruct the density matrix associated with a noisy quantum state experimentally realized on an IBMQ quantum computer. We speculate that the success of AQT stems from its ability to model quantum entanglement across the entire quantum system, much as the attention model for natural language processing captures correlations among words in a sentence.
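A minimal sketch of the kind of generative model the abstract describes: an autoregressive Transformer encoder with a causal mask that models the joint distribution of per-qubit measurement outcomes. Everything here is an illustrative assumption, not the authors' implementation: PyTorch as the framework, a four-outcome single-qubit POVM (N_OUTCOMES = 4), a small four-qubit register (N_QUBITS = 4), the class name OutcomeTransformer, and random integers standing in for real measurement records.

import torch
import torch.nn as nn

N_QUBITS = 4      # size of the (hypothetical) qubit register
N_OUTCOMES = 4    # outcomes per qubit for the assumed single-qubit POVM
D_MODEL = 64      # embedding width of the Transformer

class OutcomeTransformer(nn.Module):
    """Autoregressive model of P(a_1, ..., a_N) over measurement outcomes."""
    def __init__(self):
        super().__init__()
        # One extra token index serves as a "start" symbol for the first qubit.
        self.embed = nn.Embedding(N_OUTCOMES + 1, D_MODEL)
        self.pos = nn.Parameter(torch.zeros(N_QUBITS, D_MODEL))
        layer = nn.TransformerEncoderLayer(
            d_model=D_MODEL, nhead=4, dim_feedforward=128, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(D_MODEL, N_OUTCOMES)

    def forward(self, outcomes):
        # outcomes: (batch, N_QUBITS) integer outcome indices
        start = torch.full(
            (outcomes.size(0), 1), N_OUTCOMES,
            dtype=torch.long, device=outcomes.device,
        )
        tokens = torch.cat([start, outcomes[:, :-1]], dim=1)  # shift right
        x = self.embed(tokens) + self.pos
        # Causal mask: qubit i may only attend to qubits j <= i.
        causal = torch.triu(
            torch.ones(N_QUBITS, N_QUBITS, device=x.device), diagonal=1
        ).bool()
        h = self.encoder(x, mask=causal)
        return self.head(h)  # (batch, N_QUBITS, N_OUTCOMES) logits

if __name__ == "__main__":
    model = OutcomeTransformer()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    # Placeholder data; real training would use POVM outcome samples
    # collected from the experiment or a simulator.
    data = torch.randint(0, N_OUTCOMES, (256, N_QUBITS))
    for step in range(10):
        logits = model(data)
        loss = nn.functional.cross_entropy(
            logits.reshape(-1, N_OUTCOMES), data.reshape(-1)
        )
        opt.zero_grad()
        loss.backward()
        opt.step()
    print("negative log-likelihood:", loss.item())

Maximizing the likelihood of observed measurement records under such a model is one way to recast tomography as distribution learning; recovering a density matrix from the learned distribution additionally requires an informationally complete POVM and a reconstruction step not shown here.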