Paper Title


Jointly Optimizing Dataset Size and Local Updates in Heterogeneous Mobile Edge Learning

Authors

Umair Mohammad, Sameh Sorour, Mohamed Hefeida

Abstract


This paper proposes to maximize the accuracy of a distributed machine learning (ML) model trained on learners connected via the resource-constrained wireless edge. We jointly optimize the number of local/global updates and the task size allocation to minimize the loss while accounting for the heterogeneous communication and computation capabilities of each learner. By leveraging existing bounds on the difference between the training loss at any given iteration and the theoretically optimal loss, we derive an expression for the objective function in terms of the number of local updates. The resulting convex program is solved to obtain the optimal number of local updates, which is then used to obtain the total updates and batch sizes for each learner. The merits of the proposed heterogeneity-aware (HA) solution are exhibited by comparing its performance to the heterogeneity-unaware (HU) approach.
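The pipeline the abstract describes (derive a convex bound in the number of local updates per global round, minimize it, then back out the update schedule) can be sketched as follows. This is an illustrative toy, not the paper's actual bound: the proxy objective and every constant (per-update compute time `t_cp`, per-round communication time `t_cm`, time budget `T`, divergence penalty `delta`) are hypothetical stand-ins.

```python
# Illustrative sketch of picking the number of local updates tau per global
# round under a wall-clock budget. NOT the paper's formulation: bound_proxy
# and all constants are assumed for the example.

def bound_proxy(tau, t_cp=0.01, t_cm=0.5, T=100.0, delta=1e-4):
    """Convex-in-tau proxy for the optimality gap after time budget T.

    One global round costs tau * t_cp + t_cm seconds, so
    K = T / (tau * t_cp + t_cm) rounds fit in the budget. More total
    updates (K * tau) shrink the gap; larger tau inflates it through a
    local-model divergence term, here modeled linearly as delta * tau.
    """
    rounds = T / (tau * t_cp + t_cm)
    return 1.0 / (rounds * tau) + delta * tau

def best_local_updates(max_tau=200):
    """1-D search over integer tau; a grid suffices for a convex objective."""
    return min(range(1, max_tau + 1), key=bound_proxy)

tau_star = best_local_updates()
total_updates = tau_star * int(100.0 / (tau_star * 0.01 + 0.5))  # tau * K
```

In the paper's setting this step would be repeated with each learner's own compute and communication parameters, yielding per-learner batch sizes in addition to the shared update schedule.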
