Paper Title

Optimal Neural Network Approximation of Wasserstein Gradient Direction via Convex Optimization

Paper Authors

Yifei Wang, Peng Chen, Mert Pilanci, Wuchen Li

Paper Abstract

The computation of the Wasserstein gradient direction is essential for posterior sampling problems and scientific computing. Approximating the Wasserstein gradient with finite samples requires solving a variational problem. We study this variational problem in the family of two-layer networks with squared-ReLU activations, for which we derive a semi-definite programming (SDP) relaxation. The SDP can be viewed as an approximation of the Wasserstein gradient in a broader function family that includes two-layer networks. By solving the convex SDP, we obtain the optimal approximation of the Wasserstein gradient direction within this function class. Numerical experiments, including PDE-constrained Bayesian inference and parameter estimation in COVID-19 modeling, demonstrate the effectiveness of the proposed method.
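For context, the abstract does not spell out the variational problem; the formulation below is the standard one for the Wasserstein gradient flow of the KL divergence and is offered only as a hedged sketch of the setup the paper builds on, not as the paper's exact statement. For a target density \pi and current density \rho, the particle-level descent direction of \mathrm{KL}(\rho \,\|\, \pi) under the Wasserstein metric is \nabla \log(\pi/\rho), which admits the variational characterization

\[
\nabla \log \frac{\pi}{\rho} = \nabla f^\star, \qquad
f^\star \in \operatorname*{arg\,min}_{f}\; \mathbb{E}_{x \sim \rho}\!\left[ \tfrac{1}{2}\,\lVert \nabla f(x) \rVert^2 \;-\; \nabla f(x) \cdot \nabla \log \pi(x) \;-\; \Delta f(x) \right],
\]

since integration by parts gives \mathbb{E}_\rho[\Delta f] = -\,\mathbb{E}_\rho[\nabla f \cdot \nabla \log \rho], so the objective equals \tfrac{1}{2}\,\mathbb{E}_\rho \lVert \nabla f - \nabla \log(\pi/\rho) \rVert^2 up to an additive constant. Replacing the expectation with an empirical average over finite samples, and restricting f to a parametric family such as two-layer squared-ReLU networks, yields the finite-sample variational problem that the paper relaxes to a convex SDP; the precise formulation and relaxation used in the paper may differ in details from this sketch.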
