Paper Title
PairNets: Novel Fast Shallow Artificial Neural Networks on Partitioned Subspaces
Paper Authors
Paper Abstract
Traditionally, an artificial neural network (ANN) is trained slowly by a gradient-descent algorithm such as backpropagation, since a large number of its hyperparameters must be fine-tuned over many training epochs. To greatly speed up training, we created a novel shallow four-layer ANN called the "Pairwise Neural Network" ("PairNet") with high-speed hyperparameter optimization. In addition, the value range of each input is partitioned into multiple intervals, so that an n-dimensional input space is partitioned into M n-dimensional subspaces, and M local PairNets are built on these partitioned local subspaces. A local PairNet is trained very quickly in only one epoch, since its hyperparameters are directly optimized in a single pass by solving a system of linear equations with the multivariate least-squares fitting method. Simulation results on three regression problems indicated that the PairNet achieved much higher speeds than traditional ANNs, lower average testing mean squared errors (MSEs) in all three cases, and lower average training MSEs in two cases. Significant future work is to develop better and faster optimization algorithms, based on intelligent methods and parallel computing, that optimize both the partitioned subspaces and the hyperparameters, in order to build fast and effective PairNets for applications in big data mining and real-time machine learning.
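The abstract's two key ideas — partitioning the n-dimensional input space into local subspaces via per-input intervals, and solving a local model's parameters in closed form by multivariate least squares instead of gradient descent — can be sketched as follows. This is a minimal illustration, not the paper's actual four-layer PairNet architecture: the interval edges, the `LocalLeastSquaresModel` class, and the affine design matrix are all simplified stand-ins chosen here for clarity.

```python
import numpy as np

def partition_index(X, edges):
    """Map each sample to the index of its local subspace.

    edges: one sorted 1-D array of interval boundaries per input dimension,
    so a space with k_d intervals per dimension d yields prod(k_d) subspaces.
    """
    idx = np.zeros(len(X), dtype=int)
    for d, e in enumerate(edges):
        # searchsorted bins each coordinate into its interval along dim d
        bins = np.clip(np.searchsorted(e, X[:, d], side="right") - 1,
                       0, len(e) - 2)
        idx = idx * (len(e) - 1) + bins  # mixed-radix subspace index
    return idx

class LocalLeastSquaresModel:
    """Stand-in for one local PairNet: parameters are solved in one pass
    by multivariate least squares ("one epoch", no gradient descent)."""
    def fit(self, X, y):
        A = np.hstack([X, np.ones((len(X), 1))])  # affine design matrix
        self.w, *_ = np.linalg.lstsq(A, y, rcond=None)
        return self

    def predict(self, X):
        A = np.hstack([X, np.ones((len(X), 1))])
        return A @ self.w

def fit_partitioned(X, y, edges):
    """Train one local model per non-empty partitioned subspace."""
    idx = partition_index(X, edges)
    models = {k: LocalLeastSquaresModel().fit(X[idx == k], y[idx == k])
              for k in np.unique(idx)}
    return models, idx

# Toy demo: a globally affine target is recovered in every local subspace.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, (200, 2))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 1.0
edges = [np.array([0.0, 0.5, 1.0]), np.array([0.0, 0.5, 1.0])]  # 4 subspaces
models, idx = fit_partitioned(X, y, edges)
preds = np.empty(len(X))
for k, m in models.items():
    preds[idx == k] = m.predict(X[idx == k])
```

Because each local solve is an independent linear-algebra problem, the M local fits could also be dispatched in parallel, which is in the spirit of the parallel-computing direction the abstract mentions.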