Title
Optimization without Backpropagation
Authors
Abstract
Forward gradients have recently been introduced to bypass backpropagation in automatic differentiation while retaining unbiased estimators of the true gradient. We derive an optimality condition for the best approximating forward gradient, which leads us to mathematical insights suggesting that optimization in high dimensions is challenging with forward gradients. Our extensive experiments on test functions support this claim.
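The abstract does not define the estimator, but the standard forward-gradient construction it refers to samples a random direction v and scales v by the directional derivative ∇f(θ)·v. A minimal sketch of this estimator, assuming a known-gradient test function in place of forward-mode AD (the function name `forward_gradient` and the quadratic test function are illustrative choices, not from the paper):

```python
import numpy as np

def forward_gradient(grad_f, theta, rng):
    """One forward-gradient sample: (∇f(θ)·v) v with v ~ N(0, I).

    grad_f is used only through the scalar directional derivative
    ∇f(θ)·v, which forward-mode AD would supply in a single forward
    pass without ever materializing the full gradient.
    """
    v = rng.standard_normal(theta.shape)
    directional = grad_f(theta) @ v  # stands in for a forward-mode JVP
    return directional * v

# Illustrative test function: f(θ) = ½‖θ‖², so ∇f(θ) = θ.
grad_f = lambda theta: theta
theta = np.array([1.0, -2.0, 3.0])
rng = np.random.default_rng(0)

# Unbiasedness: averaging many samples recovers the true gradient,
# but each individual sample is a noisy rank-one surrogate.
avg = np.mean(
    [forward_gradient(grad_f, theta, rng) for _ in range(200_000)], axis=0
)
print(np.allclose(avg, theta, atol=0.1))
```

The per-sample variance of this estimator grows with the dimension of θ, which is consistent with the abstract's claim that high-dimensional optimization with forward gradients is challenging.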