Paper Title

On Finite-Time Mutual Information

Authors

Jieao Zhu, Zijian Zhang, Zhongzhichao Wan, Linglong Dai

Abstract

The Shannon-Hartley theorem accurately gives the channel capacity when the signal observation time is infinite. However, the finite-time mutual information, whose computation remains an open problem, is essential for guiding the design of practical communication systems. In this paper, we investigate the mutual information between two correlated Gaussian processes within a finite-time observation window. We first derive the finite-time mutual information by providing a limit expression, and then numerically compute the mutual information within a single finite-time window. We reveal that the number of bits transmitted per second within the finite-time window can exceed the mutual information averaged over the entire time axis, a behavior we call the exceed-average phenomenon. Furthermore, we derive a finite-time mutual information formula under a typical signal autocorrelation case by utilizing the Mercer expansion of trace-class operators, and reveal the connection between the finite-time mutual information problem and operator theory. Finally, we analytically prove the existence of the exceed-average phenomenon in this typical case and demonstrate its compatibility with the Shannon capacity.
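For readers who want a concrete handle on the quantities in the abstract, the following is a minimal numerical sketch, not the paper's method: it treats the special case of a band-limited Gaussian signal observed in white Gaussian noise over a window [0, T], where the standard Mercer (Karhunen-Loeve) expansion gives a finite-time mutual information of the form I(T) = (1/2) * sum_k log2(1 + lambda_k / (N0/2)) bits, with lambda_k the eigenvalues of the signal autocorrelation kernel restricted to [0, T]; the infinite-time benchmark is the Shannon-Hartley capacity C = W * log2(1 + S/N) bits per second. The sinc autocorrelation, the noise level, and all parameter names below are illustrative assumptions rather than quantities taken from the paper.

```python
# Minimal numerical sketch (illustrative assumptions, not the paper's exact setup):
# finite-time mutual information between a band-limited Gaussian signal and its
# noisy observation on [0, T], via eigenvalues of the autocorrelation kernel
# restricted to the window (Mercer / Karhunen-Loeve expansion).

import numpy as np

def finite_time_mi_bits(T=1.0, bandwidth=10.0, snr_psd=4.0, n_grid=800):
    """Approximate I(T) = 0.5 * sum_k log2(1 + lambda_k / (N0/2)).

    Hypothetical modeling choices, for illustration only:
      - signal autocorrelation R(tau) = P * sinc(2*W*tau)  (band-limited process)
      - additive white Gaussian noise with two-sided PSD N0/2 = 1
      - snr_psd = P / (N0/2), the signal-power-to-noise-PSD ratio
    """
    t = np.linspace(0.0, T, n_grid)
    dt = t[1] - t[0]
    tau = t[:, None] - t[None, :]
    # Autocorrelation kernel of the band-limited signal, sampled on the window
    K = snr_psd * np.sinc(2.0 * bandwidth * tau)
    # Eigenvalues of the integral operator are approximated by those of K * dt
    lam = np.linalg.eigvalsh(K * dt)
    lam = lam[lam > 1e-12]          # drop numerically negligible eigenvalues
    return 0.5 * np.sum(np.log2(1.0 + lam))

if __name__ == "__main__":
    for T in (0.25, 0.5, 1.0, 2.0):
        mi = finite_time_mi_bits(T=T)
        print(f"T = {T:4.2f} s: I(T) = {mi:7.3f} bits, I(T)/T = {mi/T:7.3f} bit/s")
```

Comparing I(T)/T for short windows against the infinite-time average is the kind of experiment behind the exceed-average phenomenon described in the abstract; the numbers produced by this sketch depend entirely on the assumed kernel and noise level.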
