Paper Title
Universal Neural Optimal Transport
Paper Authors
Paper Abstract
Optimal Transport (OT) problems are a cornerstone of many applications, but solving them is computationally expensive. To address this problem, we propose UNOT (Universal Neural Optimal Transport), a novel framework capable of accurately predicting (entropic) OT distances and plans between discrete measures for a given cost function. UNOT builds on Fourier Neural Operators, a universal class of neural networks that map between function spaces and that are discretization-invariant, which enables our network to process measures of variable resolutions. The network is trained adversarially using a second, generating network and a self-supervised bootstrapping loss. We ground UNOT in an extensive theoretical framework. Through experiments on Euclidean and non-Euclidean domains, we show that our network not only accurately predicts OT distances and plans across a wide range of datasets, but also captures the geometry of the Wasserstein space correctly. Furthermore, we show that our network can be used as a state-of-the-art initialization for the Sinkhorn algorithm with speedups of up to $7.4\times$, significantly outperforming existing approaches.
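The abstract's claim about Sinkhorn initialization can be made concrete with a small sketch. The following is a minimal log-domain Sinkhorn solver for entropic OT between discrete measures, with an optional warm-start dual potential `f_init` where a network prediction such as UNOT's could be plugged in. This is an illustration under standard entropic-OT conventions, not the paper's implementation; all names (`sinkhorn`, `f_init`, `eps`) are ours.

```python
import numpy as np

def logsumexp(a, axis):
    # Numerically stable log-sum-exp along the given axis.
    m = np.max(a, axis=axis, keepdims=True)
    return (m + np.log(np.sum(np.exp(a - m), axis=axis, keepdims=True))).squeeze(axis=axis)

def sinkhorn(mu, nu, C, eps=0.05, f_init=None, n_iters=1000, tol=1e-9):
    """Entropic OT via log-domain Sinkhorn iterations.

    mu, nu : probability vectors (the two discrete measures).
    C      : cost matrix of shape (len(mu), len(nu)).
    eps    : entropic regularization strength.
    f_init : optional warm start for the first dual potential; a learned
             prediction (e.g. from a neural network) would be supplied here.
    Returns the dual potentials (f, g) and the transport plan P.
    """
    f = np.zeros_like(mu) if f_init is None else f_init.copy()
    log_mu, log_nu = np.log(mu), np.log(nu)
    for _ in range(n_iters):
        # Alternate updates of the dual potentials g and f.
        g = -eps * logsumexp((f[:, None] - C) / eps + log_mu[:, None], axis=0)
        f_new = -eps * logsumexp((g[None, :] - C) / eps + log_nu[None, :], axis=1)
        if np.max(np.abs(f_new - f)) < tol:
            f = f_new
            break
        f = f_new
    # Transport plan: P_ij = mu_i * nu_j * exp((f_i + g_j - C_ij) / eps).
    P = np.exp((f[:, None] + g[None, :] - C) / eps
               + log_mu[:, None] + log_nu[None, :])
    return f, g, P
```

A good `f_init` (close to the true dual potential) lets the loop hit the convergence tolerance in far fewer iterations than the zero initialization, which is the mechanism behind the reported speedups.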