Paper Title

Training highly effective connectivities within neural networks with randomly initialized, fixed weights

Paper Authors

Cristian Ivan, Razvan Florian

Paper Abstract

We present novel, straightforward methods for training the connection graph of a randomly initialized neural network without training the weights. These methods do not use hyperparameters that define cutoff thresholds, and therefore remove the need to iteratively search for optimal values of such hyperparameters. We achieve performance similar to or higher than that of training all weights, at a computational cost similar to that of standard training techniques. Besides switching connections on and off, we introduce a novel way of training a network by flipping the signs of its weights. When we minimize the number of changed connections, altering less than 10% of the total is already enough to reach more than 90% of the accuracy achieved by standard training. We obtain good results even with weights of constant magnitude, and even when weights are drawn from highly asymmetric distributions. These results shed light on the over-parameterization of neural networks and on how they may be reduced to their effective size.
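
For concreteness, here is one way the two training modes the abstract describes (switching connections on/off, and flipping weight signs) could be implemented. This is a minimal sketch under my own assumptions, not the paper's actual algorithm: it uses a straight-through estimator over real-valued per-connection scores, and the names MaskedLinear, BinarizeSTE, SignFlipSTE and the initialization scales are all illustrative choices.

```python
# Minimal sketch (NOT the authors' exact method): train a binary connectivity
# mask over fixed, randomly initialized weights, using a straight-through
# estimator on real-valued per-connection scores.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BinarizeSTE(torch.autograd.Function):
    """Hard 0/1 mask in the forward pass; identity gradient backward."""

    @staticmethod
    def forward(ctx, scores):
        return (scores > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output  # straight-through: pass gradients to the scores


class SignFlipSTE(torch.autograd.Function):
    """Sign-flipping variant: outputs +/-1 instead of 0/1, so every
    connection stays active but its weight sign can be trained."""

    @staticmethod
    def forward(ctx, scores):
        return (scores >= 0).float() * 2.0 - 1.0

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output


class MaskedLinear(nn.Module):
    """Linear layer whose weights are fixed at initialization; only the
    per-connection scores (and thus the mask or signs) are trained."""

    def __init__(self, in_features, out_features, mode="mask"):
        super().__init__()
        weight = torch.randn(out_features, in_features) / in_features ** 0.5
        self.register_buffer("weight", weight)  # fixed, never updated
        self.scores = nn.Parameter(1e-2 * torch.randn_like(weight))
        self.ste = BinarizeSTE if mode == "mask" else SignFlipSTE

    def forward(self, x):
        return F.linear(x, self.weight * self.ste.apply(self.scores))


if __name__ == "__main__":
    layer = MaskedLinear(784, 10)
    opt = torch.optim.SGD(layer.parameters(), lr=0.1)
    x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
    F.cross_entropy(layer(x), y).backward()
    opt.step()  # updates only the scores; the weight buffer is untouched
```

Note that the fixed weights live in a buffer rather than a parameter, so the optimizer can only ever update the scores; this mirrors the abstract's claim that good performance is reachable without training the weights themselves.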
