Paper Title
Recurrent Bilinear Optimization for Binary Neural Networks
Paper Authors
Paper Abstract
Binary Neural Networks (BNNs) show great promise for real-world embedded devices. As one of the critical steps in achieving a powerful BNN, the scale factor calculation plays an essential role in reducing the performance gap to real-valued counterparts. However, existing BNNs neglect the intrinsic bilinear relationship between real-valued weights and scale factors, resulting in sub-optimal models caused by an insufficient training process. To address this issue, Recurrent Bilinear Optimization for BNNs (RBONN) is proposed to improve the learning process by associating the intrinsic bilinear variables in the backpropagation process. Our work is the first attempt to optimize BNNs from the bilinear perspective. Specifically, we employ recurrent optimization and Density-ReLU to sequentially backtrack the sparse real-valued weight filters, which are sufficiently trained and reach their performance limits under a controllable learning process. We obtain robust RBONNs, which show impressive performance over state-of-the-art BNNs on various models and datasets. In particular, RBONNs generalize well to the task of object detection. Our code is open-sourced at https://github.com/SteveTsui/RBONN .
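To make the bilinear relationship concrete, the sketch below shows the standard scale-factor binarization used by BNNs in this line of work (XNOR-Net style): a real-valued filter w is approximated by alpha * sign(w), so the approximation, and hence the loss, depends on the product of the two variables (alpha, w) jointly. This is a minimal illustration, not the paper's implementation; the function name is hypothetical.

```python
def binarize_filter(w):
    """Approximate a real-valued filter w (a list of floats) by alpha * b,
    where b = sign(w) and alpha = mean(|w|), the closed-form minimizer of
    ||w - alpha * b||^2 for fixed b. Returns (alpha, b)."""
    # Binary weights: the sign of each real-valued weight.
    b = [1.0 if x >= 0 else -1.0 for x in w]
    # Scale factor: mean absolute value of the real-valued weights.
    alpha = sum(abs(x) for x in w) / len(w)
    return alpha, b

# Example: the reconstruction alpha * b is bilinear in (alpha, w).
w = [0.5, -1.2, 0.3, -0.8]
alpha, b = binarize_filter(w)
reconstruction = [alpha * bi for bi in b]
```

Because alpha and w enter the approximation as a product, gradient updates that treat them independently can leave one variable under-trained, which is the coupling the paper's recurrent optimization is designed to exploit.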