Paper Title

Knowledge Distillation Applied to Optical Channel Equalization: Solving the Parallelization Problem of Recurrent Connections

Paper Authors

Srivallapanondh, Sasipim; Freire, Pedro J.; Spinnler, Bernhard; Costa, Nelson; Napoli, Antonio; Turitsyn, Sergei K.; Prilepsky, Jaroslaw E.

Paper Abstract

To circumvent the non-parallelizability of recurrent neural network-based equalizers, we propose knowledge distillation to recast the RNN into a parallelizable feedforward structure. The latter shows a 38% latency decrease, while impacting the Q-factor by only 0.5 dB.
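The abstract does not include code, so the following is only a minimal PyTorch sketch of the general distillation idea it describes: a frozen recurrent equalizer (teacher) produces soft targets, and a feedforward student, which processes every symbol window independently and is therefore parallelizable, is regressed onto them. All class names, layer sizes, the window length, and the random stand-in data are illustrative assumptions, not the authors' architecture.

```python
import torch
import torch.nn as nn

class RNNTeacher(nn.Module):
    """Hypothetical bidirectional-LSTM equalizer standing in for the teacher."""
    def __init__(self, hidden=32):
        super().__init__()
        self.rnn = nn.LSTM(input_size=2, hidden_size=hidden,
                           batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 2)  # predict I/Q of the center symbol

    def forward(self, x):                     # x: (batch, window, 2) received I/Q
        h, _ = self.rnn(x)
        return self.head(h[:, x.shape[1] // 2, :])

class FeedforwardStudent(nn.Module):
    """Hypothetical MLP student: flattens the window, so symbols can be
    equalized in parallel with no sequential hidden state."""
    def __init__(self, window=11, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(window * 2, hidden), nn.ReLU(),
            nn.Linear(hidden, 2),
        )

    def forward(self, x):                     # x: (batch, window, 2)
        return self.net(x)

teacher, student = RNNTeacher(), FeedforwardStudent()
teacher.eval()                                # teacher is frozen during distillation
opt = torch.optim.Adam(student.parameters(), lr=1e-3)
mse = nn.MSELoss()

for step in range(1000):                      # toy loop on random stand-in data
    x = torch.randn(256, 11, 2)               # windows of received I/Q samples
    with torch.no_grad():
        soft_targets = teacher(x)             # teacher's equalized outputs
    loss = mse(student(x), soft_targets)      # distillation loss
    opt.zero_grad(); loss.backward(); opt.step()
```

After training, the student alone serves at inference time; because it has no recurrent state, all windows in a batch can be processed concurrently, which is the source of the latency reduction the abstract reports.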
