Paper Title

SERIL: Noise Adaptive Speech Enhancement using Regularization-based Incremental Learning

Paper Authors

Chi-Chang Lee, Yu-Chen Lin, Hsuan-Tien Lin, Hsin-Min Wang, Yu Tsao

Abstract

Numerous noise adaptation techniques have been proposed to fine-tune deep-learning models in speech enhancement (SE) for mismatched noise environments. Nevertheless, adaptation to a new environment may lead to catastrophic forgetting of the previously learned environments. The catastrophic forgetting issue degrades the performance of SE in real-world embedded devices, which often revisit previous noise environments. The nature of embedded devices does not allow solving the issue with additional storage of all pre-trained models or earlier training data. In this paper, we propose a regularization-based incremental learning SE (SERIL) strategy, complementing existing noise adaptation strategies without using additional storage. With a regularization constraint, the parameters are updated to the new noise environment while retaining the knowledge of the previous noise environments. The experimental results show that, when faced with a new noise domain, the SERIL model outperforms the unadapted SE model. Meanwhile, compared with the current adaptive technique based on fine-tuning, the SERIL model can reduce the forgetting of previous noise environments by 52%. The results verify that the SERIL model can effectively adjust itself to new noise environments while overcoming the catastrophic forgetting issue. The results make SERIL a favorable choice for real-world SE applications, where the noise environment changes frequently.
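The abstract describes updating the model's parameters for a new noise environment under a regularization constraint that preserves knowledge of earlier environments. A minimal sketch of one such constraint, an importance-weighted quadratic penalty in the spirit of elastic weight consolidation; the penalty shape, the importance weights `omega`, and all names below are illustrative assumptions, not the paper's exact formulation:

```python
# Sketch of a regularization-based incremental-learning update
# (hypothetical EWC-style quadratic penalty; not SERIL's exact method).
#
# total_loss(theta) = new_task_loss(theta)
#                   + lam * sum_i omega[i] * (theta[i] - theta_old[i])**2

def penalized_grad(theta, theta_old, omega, task_grad, lam=1.0):
    """Gradient of the new-task loss plus the importance-weighted penalty."""
    g = task_grad(theta)
    return [gi + 2.0 * lam * wi * (ti - toi)
            for gi, wi, ti, toi in zip(g, omega, theta, theta_old)]

# Toy setting: theta_old was learned on the previous noise domain; the new
# domain's quadratic loss is minimized at `target`. omega marks dimension 0
# as important for the old domain and dimension 1 as unimportant.
theta_old = [1.0, 1.0]
target = [3.0, 3.0]
omega = [5.0, 0.0]

# Gradient of the new-task loss 0.5 * ||theta - target||^2.
task_grad = lambda th: [t - b for t, b in zip(th, target)]

theta = list(theta_old)
for _ in range(500):
    g = penalized_grad(theta, theta_old, omega, task_grad, lam=1.0)
    theta = [t - 0.05 * gi for t, gi in zip(theta, g)]

# The important parameter stays near its old value (forgetting is limited),
# while the unimportant parameter adapts fully to the new domain.
print(theta)
```

With `omega[0] = 5`, dimension 0 settles at the penalized optimum 13/11 ≈ 1.18 (close to its old value 1.0), while dimension 1, with zero importance, moves all the way to the new-task optimum 3.0. Setting `omega = [0, 0]` would recover plain fine-tuning, i.e., full adaptation with full forgetting.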
