Paper Title

Differentially Private Linear Regression over Fully Decentralized Datasets

Paper Authors

Yang Liu, Xiong Zhang, Shuqi Qin, Xiaoping Lei

Paper Abstract

This paper presents a differentially private algorithm for linear regression learning in a decentralized fashion. Under this algorithm, the privacy budget is derived theoretically, and the solution error is shown to be bounded by $O(t)$ for an $O(1/t)$ descent step size and by $O(\exp(t^{1-e}))$ for an $O(t^{-e})$ descent step size.
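To make the abstract's setting concrete, the following is a minimal sketch of differentially-private-style gradient descent for linear regression with an $O(1/t)$ step size. It is an illustration only, not the paper's algorithm: the Gaussian noise scale, the centralized (non-decentralized) setup, and all problem dimensions here are assumptions for demonstration.

```python
import numpy as np

# Illustrative sketch (NOT the paper's exact algorithm): gradient descent
# for linear regression with Gaussian noise added at each step, using an
# O(1/t) descent step size. Noise scale and data generation are assumptions.
rng = np.random.default_rng(0)

n, d = 200, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=n)

w = np.zeros(d)
noise_scale = 0.01  # assumed per-step noise magnitude (stands in for DP noise)
T = 500
for t in range(1, T + 1):
    grad = 2.0 / n * X.T @ (X @ w - y)  # gradient of the mean-squared error
    step = 1.0 / t                      # O(1/t) descent step size
    w = w - step * (grad + noise_scale * rng.normal(size=d))
```

With the decaying step size, the noisy iterate settles near the least-squares solution; the decentralized version analyzed in the paper would additionally exchange (privatized) information among nodes at each step.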
