Paper Title

Stein Variational Gaussian Processes

Paper Authors

Thomas Pinder, Christopher Nemeth, David Leslie

Paper Abstract

We show how to use Stein variational gradient descent (SVGD) to carry out inference in Gaussian process (GP) models with non-Gaussian likelihoods and large data volumes. Markov chain Monte Carlo (MCMC) is extremely computationally intensive for these situations, but the parametric assumptions required for efficient variational inference (VI) result in incorrect inference when they encounter the multi-modal posterior distributions that are common for such models. SVGD provides a non-parametric alternative to variational inference which is substantially faster than MCMC. We prove that for GP models with Lipschitz gradients the SVGD algorithm monotonically decreases the Kullback-Leibler divergence from the sampling distribution to the true posterior. Our method is demonstrated on benchmark problems in both regression and classification, a multimodal posterior, and an air quality example with 550,134 spatiotemporal observations, showing substantial performance improvements over MCMC and VI.
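The SVGD procedure the abstract refers to transports a set of particles toward a target distribution using only its score function (gradient of the log density), combining a kernel-weighted attraction term with a repulsion term that keeps particles spread out. Below is a minimal sketch of the generic SVGD update on a bimodal toy target, not the paper's GP-specific implementation; the Gaussian-mixture target, RBF bandwidth, step size, and particle count are all illustrative assumptions:

```python
import numpy as np

def grad_log_p(x):
    """Score of an assumed toy target: the 1D mixture 0.5*N(-2,1) + 0.5*N(2,1)."""
    d1 = np.exp(-0.5 * (x + 2.0) ** 2)
    d2 = np.exp(-0.5 * (x - 2.0) ** 2)
    return (d1 * (-2.0 - x) + d2 * (2.0 - x)) / (d1 + d2)

def svgd_step(x, eps=0.05, h=1.0):
    """One SVGD update:
    x_i += eps * (1/n) * sum_j [ k(x_j, x_i) * score(x_j) + d k(x_j, x_i)/d x_j ]."""
    diff = x[:, None] - x[None, :]        # (n, n) pairwise differences x_j - x_i
    K = np.exp(-diff ** 2 / h)            # RBF kernel matrix (symmetric)
    grad_K = -2.0 / h * diff * K          # grad_K[j, i] = d k(x_j, x_i) / d x_j
    phi = (K @ grad_log_p(x) + grad_K.sum(axis=0)) / x.shape[0]
    return x + eps * phi

rng = np.random.default_rng(0)
particles = rng.normal(0.0, 2.0, size=50)  # initial particle cloud
for _ in range(1000):
    particles = svgd_step(particles)
# particles now approximate the bimodal target, settling around both modes
```

The repulsion term (`grad_K.sum(axis=0)`) is what lets SVGD capture multiple posterior modes where a unimodal variational family cannot, which is the behaviour the abstract highlights for multi-modal GP posteriors.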
