Paper Title
Sub-linear Regret Bounds for Bayesian Optimisation in Unknown Search Spaces
Authors
Abstract
Bayesian optimisation (BO) is a popular method for efficient optimisation of expensive black-box functions. Traditionally, BO assumes that the search space is known. However, in many problems this assumption does not hold. To this end, we propose a novel BO algorithm which expands (and shifts) the search space over iterations, controlling the expansion rate through a hyperharmonic series. Further, we propose another variant of our algorithm that scales to high dimensions. We show theoretically that for both of our algorithms, the cumulative regret grows at a sub-linear rate. Our experiments with synthetic and real-world optimisation tasks demonstrate the superiority of our algorithms over the current state-of-the-art methods for Bayesian optimisation in unknown search spaces.
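The abstract does not give the concrete expansion rule, so the following is only a rough illustrative sketch (not the authors' method) of the general idea: growing a box-shaped search space by increments proportional to the terms of a hyperharmonic series 1/t^gamma with gamma > 1, so that the total growth over all iterations remains bounded. The function name `expand_search_space` and the parameters `gamma` and `kappa` are hypothetical.

```python
import numpy as np

def expand_search_space(lower, upper, t, gamma=1.5, kappa=0.1):
    """Illustrative expansion of a box search space at iteration t.

    Hypothetical sketch: the half-width of the box grows by an increment
    proportional to the t-th term of a hyperharmonic series, 1 / t**gamma
    with gamma > 1, so the cumulative expansion over all iterations is
    bounded. `gamma` and `kappa` are illustrative parameters, not taken
    from the paper.
    """
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)
    centre = (upper + lower) / 2.0
    half_width = (upper - lower) / 2.0
    increment = kappa * half_width / t**gamma  # hyperharmonic term 1/t^gamma
    return centre - (half_width + increment), centre + (half_width + increment)

# Example: expand an initial 2-D unit box over 5 iterations.
lo, hi = np.array([0.0, 0.0]), np.array([1.0, 1.0])
for t in range(1, 6):
    lo, hi = expand_search_space(lo, hi, t)
    print(t, lo, hi)
```

Because the series 1/t^gamma converges for gamma > 1, the box in this sketch approaches a fixed limiting size rather than growing without bound, which is the kind of controlled expansion the abstract alludes to when bounding cumulative regret.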