Title

Combined Learning of Neural Network Weights for Privacy in Collaborative Tasks

Authors

Ioste, Aline R., Durham, Alan M., Finger, Marcelo

Abstract

We introduce CoLN, Combined Learning of Neural network weights, a novel method to securely combine Machine Learning models over sensitive data with no sharing of data. With CoLN, local hosts use the same Neural Network architecture and base parameters to train a model using only locally available data. Locally trained models are then submitted to a combining agent, which produces a combined model. The new model's parameters can be sent back to hosts, and can then be used as initial parameters for a new training iteration. CoLN is capable of combining several distributed neural networks of the same kind but is not restricted to any single neural architecture. In this paper we detail the combination algorithm and present experiments with feed-forward, convolutional, and recurrent Neural Network architectures, showing that the CoLN combined model approximates the performance of a hypothetical ideal centralized model, trained using the combination of the local datasets. CoLN can contribute to secure collaborative research, as required in the medical area, where privacy issues preclude data sharing, but where the limitations of local data demand information derived from larger datasets.
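The abstract describes an iterate-and-combine loop: local hosts train on private data, submit parameters to a combining agent, and receive combined parameters back as the starting point for the next round. The sketch below illustrates that loop only; it uses simple parameter averaging as a stand-in for the combining step, since the actual CoLN combination algorithm is detailed in the paper body, and the function names (`combine_models`, `coln_round`) are hypothetical.

```python
import numpy as np

def combine_models(local_weights):
    """Combine the parameter dicts submitted by each local host.

    `local_weights` is a list of dicts mapping layer names to NumPy
    arrays; all hosts share the same architecture, so keys match.
    Plain averaging is an illustrative placeholder -- the actual CoLN
    combination rule is given in the paper, not here.
    """
    keys = local_weights[0].keys()
    return {k: np.mean([w[k] for w in local_weights], axis=0) for k in keys}

def coln_round(hosts, weights, local_train, rounds=3):
    """Iterate: broadcast combined weights, train locally, recombine.

    `local_train(host, weights)` stands in for a host training on its
    private data starting from the broadcast parameters; no data ever
    leaves a host -- only parameter dicts are exchanged.
    """
    for _ in range(rounds):
        local_results = [local_train(h, weights) for h in hosts]
        weights = combine_models(local_results)
    return weights
```

Note that only model parameters cross host boundaries in this loop, which is the privacy property the abstract emphasizes.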
