Paper Title
Fantasizing with Dual GPs in Bayesian Optimization and Active Learning
Paper Authors
Paper Abstract
Gaussian processes (GPs) are the main surrogate functions used for sequential modelling such as Bayesian Optimization and Active Learning. Their drawbacks are poor scaling with data and the need to run an optimization loop when using a non-Gaussian likelihood. In this paper, we focus on `fantasizing' batch acquisition functions that need the ability to condition on new fantasized data in a computationally efficient way. By using a sparse Dual GP parameterization, we gain linear scaling with batch size as well as one-step updates for non-Gaussian likelihoods, thus extending sparse models to greedy batch fantasizing acquisition functions.
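To make the "fantasizing" idea concrete, the following is a minimal NumPy sketch of greedy batch selection with an exact GP: at each step the model conditions on its own predicted mean at the chosen point (a fantasy label) before picking the next one. All names here (the RBF kernel, the max-variance acquisition, the toy data) are illustrative assumptions, not the paper's method; in particular, the naive reconditioning below is cubic in the data size, which is exactly the cost the paper's sparse dual GP parameterization avoids with its one-step updates.

```python
import numpy as np

def rbf(a, b, ls=1.0):
    # Squared-exponential kernel between the rows of a and b.
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

def gp_posterior(X, y, Xs, noise=1e-2):
    # Exact GP posterior mean and marginal variance at test points Xs.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    mu = Ks.T @ np.linalg.solve(K, y)
    var = np.diag(rbf(Xs, Xs) - Ks.T @ np.linalg.solve(K, Ks))
    return mu, var

# Toy 1-D data and a candidate grid (illustrative assumptions).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(5, 1))
y = np.sin(X[:, 0])
cand = np.linspace(-3, 3, 50)[:, None]

batch = []
for _ in range(3):
    mu, var = gp_posterior(X, y, cand)
    i = int(np.argmax(var))  # max-variance acquisition (a stand-in choice)
    batch.append(float(cand[i, 0]))
    # Fantasize: append the predicted mean as if it were observed, then
    # recondition. Done naively this is O(n^3) per step; the dual GP
    # parameterization makes it a cheap update, linear in batch size.
    X = np.vstack([X, cand[i:i + 1]])
    y = np.append(y, mu[i])

print(batch)
```

Because conditioning on a fantasy point collapses the posterior variance there, each greedy step is pushed toward a different region, which is what makes the batch diverse.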