Paper Title


Synthesis and Pruning as a Dynamic Compression Strategy for Efficient Deep Neural Networks

Authors

Alastair Finlinson, Sotiris Moschoyiannis

Abstract


The brain is a highly reconfigurable machine capable of task-specific adaptations: it continually rewires itself into more optimal configurations to solve problems. We propose a novel strategic synthesis algorithm for feedforward networks that draws directly on the brain's behaviour during learning. The proposed approach analyses the network and ranks weights by their magnitude. Unlike existing approaches that advocate random selection, we select highly performing nodes as starting points for new edges and exploit the Gaussian distribution over the weights to select the corresponding endpoints. The strategy aims to produce only useful connections, resulting in a smaller residual network structure. The approach is complemented with pruning for further compression. We demonstrate the techniques on deep feedforward networks. The residual sub-networks formed by the synthesis approaches in this work constitute common sub-networks with similarities of up to ~90%. Using pruning as a complement to the strategic synthesis approach, we observe improvements in compression.
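The synthesis step the abstract describes — rank nodes by weight magnitude, start new edges from strong nodes, and initialise them from a Gaussian fitted to the surviving weights — can be sketched roughly as follows. This is an illustrative reconstruction, not the paper's implementation: the layer shape, pruning threshold, and uniform endpoint selection among unconnected units are all assumptions made here for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weight matrix of one feedforward layer (out_dim x in_dim).
W = rng.normal(0.0, 0.5, size=(8, 16))
mask = np.abs(W) > 0.6          # magnitude pruning: drop small-weight edges
W = W * mask

def synthesize_edges(W, mask, n_new, rng):
    """Add up to n_new edges, starting from high-magnitude ("highly
    performing") nodes, with new weights drawn from a Gaussian fitted
    to the surviving weights (a sketch of strategic synthesis)."""
    # Rank input nodes by total outgoing weight magnitude, strongest first.
    node_strength = np.abs(W).sum(axis=0)
    start_nodes = np.argsort(node_strength)[::-1]

    # Fit a Gaussian to the surviving weights; new edges are initialised
    # by sampling from it rather than from a fixed scheme.
    nonzero = W[mask]
    mu, sigma = nonzero.mean(), nonzero.std() + 1e-8

    added = 0
    for i in start_nodes:
        if added == n_new:
            break
        # Candidate endpoints: output units not yet connected to node i.
        # (Uniform choice here is a simplification of endpoint selection.)
        free = np.flatnonzero(~mask[:, i])
        if free.size == 0:
            continue
        j = rng.choice(free)
        W[j, i] = rng.normal(mu, sigma)   # initialise the new edge
        mask[j, i] = True
        added += 1
    return W, mask, added

W, mask, added = synthesize_edges(W, mask, n_new=5, rng=rng)
print(added)
```

Alternating this synthesis step with further rounds of magnitude pruning would mirror the paper's combined compression strategy, with the mask tracking the residual sub-network at each stage.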
