Paper Title
A Wasserstein Minimum Velocity Approach to Learning Unnormalized Models
Paper Authors
Paper Abstract
Score matching provides an effective approach to learning flexible unnormalized models, but its scalability is limited by the need to evaluate a second-order derivative. In this paper, we present a scalable approximation to a general family of learning objectives including score matching, by observing a new connection between these objectives and Wasserstein gradient flows. We present applications with promise in learning neural density estimators on manifolds, and training implicit variational and Wasserstein auto-encoders with a manifold-valued prior.
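For context on the second-order derivative the abstract refers to, the standard score matching objective (Hyvärinen, 2005) can be written as follows; the Hessian-trace term is the second-order quantity whose evaluation limits scalability. This is the classical formulation, not an equation quoted from this paper:

```latex
% Standard score matching objective (Hyvarinen, 2005).
% The trace of the Hessian of log p_theta is the second-order
% derivative term that is expensive to evaluate at scale.
J(\theta) = \mathbb{E}_{p_{\mathrm{data}}(x)}\!\left[
    \operatorname{tr}\!\big(\nabla_x^2 \log p_\theta(x)\big)
    + \tfrac{1}{2}\,\big\lVert \nabla_x \log p_\theta(x) \big\rVert_2^2
\right]
```

The paper's contribution, per the abstract, is a scalable approximation to a family of objectives of this kind, derived from a connection to Wasserstein gradient flows.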