Paper Title

Kernel Distributionally Robust Optimization

Paper Authors

Jia-Jie Zhu, Wittawat Jitkrittum, Moritz Diehl, Bernhard Schölkopf

Paper Abstract

We propose kernel distributionally robust optimization (Kernel DRO) using insights from robust optimization theory and functional analysis. Our method uses reproducing kernel Hilbert spaces (RKHS) to construct a wide range of convex ambiguity sets, which can be generalized to sets based on integral probability metrics and finite-order moment bounds. This perspective unifies multiple existing robust and stochastic optimization methods. We prove a theorem that generalizes the classical duality in the mathematical problem of moments. Enabled by this theorem, we reformulate the maximization with respect to measures in DRO into a dual program that searches over RKHS functions. Using universal RKHSs, the theorem applies to a broad class of loss functions, lifting common limitations such as polynomial losses and knowledge of the Lipschitz constant. We then establish a connection between DRO and stochastic optimization with expectation constraints. Finally, we propose practical algorithms based on both batch convex solvers and stochastic functional gradients, which apply to general optimization and machine learning tasks.
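To make the dual reformulation described in the abstract concrete, below is a minimal sketch (not the authors' code) of the inner dual program for one specific instance: an MMD-ball ambiguity set of radius eps around the empirical distribution, a Gaussian kernel, and a fixed decision theta. In this setting the paper's duality turns the worst-case expectation over measures into a search over RKHS functions f that majorize the loss, minimizing the empirical mean of f plus eps times its RKHS norm. The finite grid of certification points zs is an illustrative surrogate for the pointwise constraint f >= loss over the whole domain; all names (gaussian_kernel, kernel_dro_dual_value, zs, eps) are assumptions of this sketch.

```python
# Sketch of the Kernel DRO dual for an MMD-ball ambiguity set (illustrative only):
#   min_{f in RKHS}  (1/n) sum_i f(x_i) + eps * ||f||_H
#   s.t.             f(z_j) >= loss(theta, z_j)  at certification points z_j,
# with f parameterized as f = sum_j alpha_j k(z_j, .).
import numpy as np
import cvxpy as cp

def gaussian_kernel(A, B, bandwidth=1.0):
    """Gram matrix k(a, b) = exp(-||a - b||^2 / (2 * bandwidth^2))."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * bandwidth**2))

def kernel_dro_dual_value(xs, zs, loss_at_z, eps, bandwidth=1.0):
    """Solve min_f E_{P_hat}[f] + eps*||f||_H s.t. f >= loss on the grid zs.

    xs: (n, d) empirical samples defining P_hat.
    zs: (m, d) certification points carrying the majorization constraint.
    loss_at_z: (m,) loss values l(theta, z_j) at a fixed decision theta.
    """
    K_zz = gaussian_kernel(zs, zs, bandwidth)   # for ||f||_H and f(z_j)
    K_xz = gaussian_kernel(xs, zs, bandwidth)   # for f(x_i)
    alpha = cp.Variable(len(zs))                # f = sum_j alpha_j k(z_j, .)
    # ||f||_H = sqrt(alpha^T K_zz alpha); use a Cholesky factor for DCP compliance.
    L = np.linalg.cholesky(K_zz + 1e-8 * np.eye(len(zs)))
    objective = cp.Minimize(cp.sum(K_xz @ alpha) / len(xs)
                            + eps * cp.norm(L.T @ alpha, 2))
    problem = cp.Problem(objective, [K_zz @ alpha >= loss_at_z])
    problem.solve()
    return problem.value, alpha.value

# Toy usage: robust value of a quadratic loss around 10 Gaussian samples.
rng = np.random.default_rng(0)
xs = rng.normal(size=(10, 1))
zs = np.linspace(-3, 3, 30)[:, None]            # certification grid
loss_at_z = (zs[:, 0] - 0.5) ** 2               # l(theta, z) with theta fixed
value, _ = kernel_dro_dual_value(xs, zs, loss_at_z, eps=0.1)
print("robust objective upper bound:", value)
```

In the full method, this inner convex program would be solved jointly with the outer minimization over the decision theta (by a batch convex solver when the loss is convex in theta, or by the stochastic functional-gradient scheme the paper proposes); the sketch fixes theta to isolate the measure-to-RKHS-function duality.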
