Paper Title

Stochastic Normalizing Flows

Authors

Hao Wu, Jonas Köhler, Frank Noé

Abstract

The sampling of probability distributions specified up to a normalization constant is an important problem in both machine learning and statistical mechanics. While classical stochastic sampling methods such as Markov Chain Monte Carlo (MCMC) or Langevin Dynamics (LD) can suffer from slow mixing times, there is growing interest in using normalizing flows to learn the transformation of a simple prior distribution into a given target distribution. Here we propose a generalized and combined approach to sample target densities: Stochastic Normalizing Flows (SNF) -- an arbitrary sequence of deterministic invertible functions and stochastic sampling blocks. We show that stochasticity overcomes the expressivity limitations of normalizing flows that result from the invertibility constraint, whereas trainable transformations between sampling steps improve the efficiency of pure MCMC/LD along the flow. By invoking ideas from non-equilibrium statistical mechanics, we derive an efficient training procedure by which both the sampler's and the flow's parameters can be optimized end-to-end, and by which we can compute exact importance weights without having to marginalize out the randomness of the stochastic blocks. We illustrate the representational power, sampling efficiency, and asymptotic correctness of SNFs on several benchmarks, including applications to sampling molecular systems in equilibrium.
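To make the composition described above concrete, here is a minimal NumPy sketch of the SNF idea: deterministic invertible layers add log|det J| to a running log path weight, stochastic blocks satisfying detailed balance with respect to an energy u add u(y) - u(y') (so the weights stay exact without marginalizing out the blocks' randomness), and the final importance weight combines these with the prior and target energies. The 1D double-well target, the fixed affine layer, and names like `affine_block` and `metropolis_block` are illustrative assumptions for this sketch, not the authors' implementation.

```python
import numpy as np

# Unnormalized target: 1D double well exp(-u(x)) with modes near +/-2 (illustrative).
def u_target(x):
    return (x**2 - 4.0) ** 2 / 8.0

def u_prior(x):
    return 0.5 * x**2  # standard-normal energy, up to an additive constant

# Deterministic invertible block: affine map x -> a*x + b.
# Contributes log|det J| = log|a| to the log path weight.
def affine_block(x, logw, a=2.0, b=0.0):
    return a * x + b, logw + np.log(abs(a))

# Stochastic block: one Metropolis step targeting exp(-u).
# A kernel in detailed balance with exp(-u) contributes u(x) - u(x')
# to the log path weight (zero whenever the proposal is rejected).
def metropolis_block(x, logw, u, step=0.5, rng=None):
    rng = rng or np.random.default_rng()
    prop = x + step * rng.standard_normal(x.shape)
    accept = rng.random(x.shape) < np.exp(np.minimum(u(x) - u(prop), 0.0))
    x_new = np.where(accept, prop, x)
    return x_new, logw + (u(x) - u(x_new))

rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)        # z_0 ~ prior
logw = u_prior(x)                      # +u_0(z_0) term of the path weight
x, logw = affine_block(x, logw)        # deterministic flow layer
for _ in range(20):                    # a few stochastic refinement steps
    x, logw = metropolis_block(x, logw, u_target, rng=rng)
logw -= u_target(x)                    # -u_target(z_K) term

# Self-normalized importance-sampling estimate of E[x^2] under the target.
w = np.exp(logw - logw.max())
print("E[x^2] ~", np.sum(w * x**2) / np.sum(w))
```

In the paper, the flow layers' parameters would be trained end-to-end through this same path-weight construction; the fixed affine scale here merely stands in for a trained flow layer.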
