Paper Title

TO-FLOW: Efficient Continuous Normalizing Flows with Temporal Optimization adjoint with Moving Speed

Authors

Du, Shian, Luo, Yihong, Chen, Wei, Xu, Jian, Zeng, Delu

Abstract

Continuous normalizing flows (CNFs) construct invertible mappings between an arbitrary complex distribution and an isotropic Gaussian distribution using Neural Ordinary Differential Equations (neural ODEs). Training has not been tractable on large datasets due to the incremental complexity of neural ODE training. Optimal Transport theory has been applied in recent works to regularize the dynamics of the ODE and speed up training. In this paper, a temporal optimization is proposed that optimizes the evolution time for forward propagation during neural ODE training. In this approach, we optimize the network weights of the CNF alternately with the evolution time by coordinate descent. With an additional temporal regularization, stability of the evolution is ensured. This approach can be used in conjunction with the original regularization approaches. We experimentally demonstrate that the proposed approach can significantly accelerate training without sacrificing performance relative to baseline models.
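The alternating scheme the abstract describes can be sketched as plain coordinate descent: update the network weights with the evolution time T fixed, then update T with the weights fixed, while a temporal regularizer penalizes large T. The snippet below is a minimal toy illustration of this pattern only; `loss`, `grad_w`, and `grad_T` are hypothetical stand-ins, not the paper's actual CNF objective or gradients.

```python
# Toy sketch of alternating (coordinate-descent) optimization over model
# weights w and evolution time T, with a quadratic temporal regularizer.
# All objective functions here are illustrative assumptions, not TO-Flow's
# real negative log-likelihood after ODE integration up to time T.

LAM = 0.1  # strength of the temporal regularizer lam * T**2

def loss(w, T):
    # hypothetical smooth objective: a fit term whose difficulty depends
    # on T, plus the temporal regularization term
    fit = (w - 2.0) ** 2 * (1.0 + 0.5 * (T - 1.0) ** 2)
    return fit + LAM * T ** 2

def grad_w(w, T):
    # partial derivative of loss with respect to the "weights" w
    return 2.0 * (w - 2.0) * (1.0 + 0.5 * (T - 1.0) ** 2)

def grad_T(w, T):
    # partial derivative of loss with respect to the evolution time T
    return (w - 2.0) ** 2 * (T - 1.0) + 2.0 * LAM * T

w, T = 0.0, 2.0
initial = loss(w, T)
lr_w, lr_T = 0.1, 0.05
for _ in range(200):
    # coordinate descent: weights step with T frozen, then a T step
    w -= lr_w * grad_w(w, T)
    T -= lr_T * grad_T(w, T)
```

The key design point is that T is treated as just another optimization variable, so the ODE solver's integration horizon shrinks when a shorter evolution suffices, which is where the training speed-up comes from.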
