Paper Title

A dual approach for federated learning

Authors

Zhenan Fan, Huang Fang, Michael P. Friedlander

Abstract

We study the federated optimization problem from a dual perspective and propose a new algorithm termed federated dual coordinate descent (FedDCD), which is based on a type of coordinate descent method developed by Necoara et al. [Journal of Optimization Theory and Applications, 2017]. Additionally, we enhance the FedDCD method with inexact gradient oracles and Nesterov's acceleration. We demonstrate theoretically that our proposed approach achieves better convergence rates than the state-of-the-art primal federated optimization algorithms in certain situations. Numerical experiments on real-world datasets support our analysis.
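The abstract names the key ingredients (a dual formulation and a coordinate descent method of the type studied by Necoara et al.) but does not spell out the update rule. As a rough, self-contained illustration of the flavor of linearly constrained coordinate descent that family of methods belongs to, the Python sketch below runs randomized pairwise updates on a toy separable quadratic subject to a sum-to-zero coupling constraint. The quadratic terms, step size, and pairing scheme are illustrative assumptions only; this is not the FedDCD algorithm from the paper.

```python
import numpy as np

# Toy sketch: randomized pairwise coordinate descent on a separable objective
# sum_i g_i(y_i) subject to the coupling constraint sum_i y_i = 0.
# It mirrors the general idea of constrained coordinate descent (cf. Necoara
# et al., JOTA 2017) but is NOT the FedDCD method itself; the paper's dual
# formulation and client-side updates are not reproduced here.

rng = np.random.default_rng(0)

n_clients, dim = 5, 3
A = [np.diag(rng.uniform(1.0, 3.0, size=dim)) for _ in range(n_clients)]  # hypothetical local curvature
b = [rng.normal(size=dim) for _ in range(n_clients)]                      # hypothetical local linear terms

def grad_i(i, y_i):
    """Gradient of the illustrative local term g_i(y) = 0.5 y'A_i y - b_i'y."""
    return A[i] @ y_i - b[i]

# One dual block per client, initialized so that sum_i y_i = 0 holds.
y = [np.zeros(dim) for _ in range(n_clients)]
step = 0.1  # small enough for the curvature range chosen above

for t in range(2000):
    # Pick two distinct blocks and move them in opposite directions, so the
    # sum-to-zero constraint is preserved exactly at every iteration.
    i, j = rng.choice(n_clients, size=2, replace=False)
    d = grad_i(i, y[i]) - grad_i(j, y[j])
    y[i] -= step * d
    y[j] += step * d

print("constraint residual:", np.linalg.norm(sum(y)))  # ~0 up to round-off
print("objective:", sum(0.5 * y[k] @ A[k] @ y[k] - b[k] @ y[k] for k in range(n_clients)))
```

The pairwise step is the characteristic trick: moving two blocks by equal and opposite amounts keeps the linear coupling constraint satisfied without any projection, while the chosen direction is still a descent direction for the separable objective.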
