Title
Recurrent autoencoder with sequence-aware encoding
Authors
Abstract
Recurrent Neural Networks (RNN) have received a vast amount of attention over the last decade. Recently, Recurrent AutoEncoder (RAE) architectures have found many applications in practice. An RAE can extract semantically valuable information, called the context, which represents a latent space useful for further processing. Nevertheless, recurrent autoencoders are hard to train, and the training process is time-consuming. In this paper, we propose an autoencoder architecture with sequence-aware encoding, which employs a 1D convolutional layer to improve its performance in terms of model training time. We show that the recurrent autoencoder with sequence-aware encoding outperforms a standard RAE in terms of training speed in most cases. The preliminary results show that the proposed solution dominates the standard RAE, with a training process that is an order of magnitude faster.
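The abstract does not detail the architecture, but the core idea it names, sliding a 1D convolution along the sequence of encoder hidden states to build a fixed-size context, can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's implementation; the function name, kernel weights, and shapes are hypothetical.

```python
import numpy as np

def conv1d_context(states, kernel, stride=1):
    """Illustrative sketch: slide a 1D kernel along the time axis of
    RNN hidden states (shape T x H) and stack the responses into a
    sequence-aware context. Not the paper's actual implementation."""
    T, H = states.shape
    k = kernel.shape[0]
    rows = []
    for t in range(0, T - k + 1, stride):
        window = states[t:t + k]                      # (k, H) slice of time steps
        # Weighted sum of k consecutive hidden states, per feature.
        rows.append((window * kernel[:, None]).sum(axis=0))
    return np.stack(rows)                             # (T', H) context

# Toy example: 6 time steps, 4 hidden units, smoothing kernel of width 3.
rng = np.random.default_rng(0)
h = rng.normal(size=(6, 4))
ctx = conv1d_context(h, np.array([0.25, 0.5, 0.25]))
print(ctx.shape)  # (4, 4)
```

The convolution mixes neighbouring time steps, so each context row summarizes a local span of the input sequence rather than forcing the entire sequence through the final hidden state alone.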