Title

Automatic differentiation of nonsmooth iterative algorithms

Authors

Jérôme Bolte, Edouard Pauwels, Samuel Vaiter

Abstract

Differentiation along algorithms, i.e., piggyback propagation of derivatives, is now routinely used to differentiate iterative solvers in differentiable programming. The asymptotics are well understood for many smooth problems, but the nondifferentiable case is hardly considered. Is there a limiting object for nonsmooth piggyback automatic differentiation (AD)? Does it have a variational meaning, and can it be used effectively in machine learning? Is there a connection with classical derivatives? All these questions are addressed under appropriate nonexpansivity conditions within the framework of conservative derivatives, which has proved useful for understanding nonsmooth AD. We characterize the attractor set of nonsmooth piggyback iterations as a set-valued fixed point that remains within the conservative framework. This has various consequences, in particular almost-everywhere convergence of the classical derivatives. Our results are illustrated on parametric convex optimization problems with the forward-backward, Douglas-Rachford, and alternating direction method of multipliers (ADMM) algorithms, as well as the heavy-ball method.
