Paper Title

Neural network compression via learnable wavelet transforms

Authors

Moritz Wolter, Shaohui Lin, Angela Yao

Abstract

Wavelets are well known for data compression, yet have rarely been applied to the compression of neural networks. This paper shows how the fast wavelet transform can be used to compress linear layers in neural networks. Linear layers still occupy a significant portion of the parameters in recurrent neural networks (RNNs). Through our method, we can learn both the wavelet bases and the corresponding coefficients to efficiently represent the linear layers of RNNs. Our wavelet-compressed RNNs have significantly fewer parameters yet still perform competitively with the state of the art on synthetic and real-world RNN benchmarks. Wavelet optimization adds basis flexibility without requiring large numbers of extra weights. Source code is available at https://github.com/v0lta/Wavelet-network-compression.
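To make the core idea concrete, below is a minimal PyTorch sketch, not the authors' implementation (see the linked repository for that). It parameterizes a square linear map as: fast wavelet transform (FWT) → learnable diagonal gain → inverse FWT, with the filter taps themselves trainable (initialized to Haar) and a soft penalty that keeps them a valid orthogonal wavelet filter pair. The class name LearnableWaveletLinear and the helper wavelet_loss are illustrative, and only a single transform level is used for simplicity.

```python
# Minimal sketch (assumptions noted above): a dense n x n weight matrix is
# replaced by n gains plus 4 filter taps, i.e. n + 4 trainable parameters.
import torch
import torch.nn.functional as F


class LearnableWaveletLinear(torch.nn.Module):
    def __init__(self, n: int):
        super().__init__()
        assert n % 2 == 0, "feature size must be even for one FWT level"
        s = 2 ** -0.5
        # Trainable analysis filters, initialized to the Haar wavelet.
        self.h = torch.nn.Parameter(torch.tensor([s, s]))   # low-pass
        self.g = torch.nn.Parameter(torch.tensor([s, -s]))  # high-pass
        # Learnable diagonal gain in the wavelet domain.
        self.c = torch.nn.Parameter(torch.ones(n))

    def analysis(self, x):  # x: (batch, n) -> (batch, n)
        x = x.unsqueeze(1)                                  # (batch, 1, n)
        a = F.conv1d(x, self.h.view(1, 1, -1), stride=2)    # approximation
        d = F.conv1d(x, self.g.view(1, 1, -1), stride=2)    # detail
        return torch.cat([a, d], dim=-1).squeeze(1)

    def synthesis(self, y):  # single-level inverse transform
        a, d = y.chunk(2, dim=-1)
        a, d = a.unsqueeze(1), d.unsqueeze(1)
        x = (F.conv_transpose1d(a, self.h.view(1, 1, -1), stride=2)
             + F.conv_transpose1d(d, self.g.view(1, 1, -1), stride=2))
        return x.squeeze(1)

    def forward(self, x):
        return self.synthesis(self.c * self.analysis(x))

    def wavelet_loss(self):
        # Soft constraint: unit-norm, mutually orthogonal filters
        # (satisfied exactly by the Haar initialization).
        return ((self.h @ self.h - 1.0) ** 2
                + (self.g @ self.g - 1.0) ** 2
                + (self.h @ self.g) ** 2)


layer = LearnableWaveletLinear(64)
x = torch.randn(8, 64)
y = layer(x)                                                # (8, 64)
loss = y.pow(2).mean() + layer.wavelet_loss()               # add soft penalty
loss.backward()
```

For n = 64 this sketch uses 68 trainable parameters instead of the 4096 of a dense layer; adding wavelet_loss to the training objective is what lets the filters move away from Haar (the "learnable basis") without losing the wavelet structure.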
