Paper Title

Federated Learning of a Mixture of Global and Local Models

Paper Authors

Filip Hanzely and Peter Richtárik

Paper Abstract

We propose a new optimization formulation for training federated learning models. The standard formulation has the form of an empirical risk minimization problem constructed to find a single global model trained from the private data stored across all participating devices. In contrast, our formulation seeks an explicit trade-off between this traditional global model and the local models, which can be learned by each device from its own private data without any communication. Further, we develop several efficient variants of SGD (with and without partial participation and with and without variance reduction) for solving the new formulation and prove communication complexity guarantees. Notably, our methods are similar but not identical to federated averaging / local SGD, thus shedding some light on the role of local steps in federated learning. In particular, we are the first to i) show that local steps can improve communication for problems with heterogeneous data, and ii) point out that personalization yields reduced communication complexity.
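
The trade-off described in the abstract can be made concrete as a penalized objective (a sketch consistent with the abstract's description; here $x_i$ denotes device $i$'s local model, $\bar{x}$ their average, and $\lambda \ge 0$ the mixing parameter, all notation introduced for illustration):

$$
\min_{x_1,\dots,x_n \in \mathbb{R}^d} \; \frac{1}{n}\sum_{i=1}^n f_i(x_i) \;+\; \frac{\lambda}{2n}\sum_{i=1}^n \big\|x_i - \bar{x}\big\|^2, \qquad \bar{x} := \frac{1}{n}\sum_{i=1}^n x_i.
$$

Setting $\lambda = 0$ decouples the devices and yields purely local models trained without any communication, while letting $\lambda \to \infty$ forces $x_1 = \dots = x_n$ and recovers the standard global empirical risk minimization formulation.

The SGD variants mentioned above can be sketched as a randomized method that, at each step, flips a biased coin and either takes local gradient steps (no communication) or pulls the local models toward their average (one communication round). The sketch below is illustrative only, assuming the penalized objective above; the step-size scaling is chosen so that each branch is an unbiased gradient estimator, and is not the paper's exact tuning.

```python
import numpy as np

def mixed_objective_sgd(grads, x0, lam, alpha, p, n_steps, seed=0):
    """Illustrative SGD on (1/n) sum_i f_i(x_i) + (lam/2n) sum_i ||x_i - mean||^2.

    grads:  list of n callables, grads[i](x_i) returns the gradient of f_i at x_i
    x0:     (n, d) array of initial local models, one row per device
    lam:    mixing parameter (0 = purely local, large = near-global)
    p:      probability of a communication (averaging) step
    alpha:  step size
    """
    rng = np.random.default_rng(seed)
    x = x0.copy()
    n = x.shape[0]
    for _ in range(n_steps):
        if rng.random() < p:
            # Communication step: pull every local model toward the average.
            # The 1/p factor keeps the stochastic gradient unbiased.
            x_bar = x.mean(axis=0)
            x -= (alpha * lam / (p * n)) * (x - x_bar)
        else:
            # Local step: each device updates on its own private data only.
            # The 1/(1-p) factor likewise keeps the estimator unbiased.
            g = np.stack([grads[i](x[i]) for i in range(n)])
            x -= (alpha / ((1.0 - p) * n)) * g
    return x
```

Note that with probability $1 - p$ no communication happens at all, which is how local steps enter the picture: a smaller $p$ means more local work per communication round.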
