Paper Title


Mitigating Query-Flooding Parameter Duplication Attack on Regression Models with High-Dimensional Gaussian Mechanism

Authors

Xiaoguang Li, Hui Li, Haonan Yan, Zelei Cheng, Wenhai Sun, Hui Zhu

Abstract


Public intelligent services enabled by machine learning algorithms are vulnerable to model extraction attacks, which can steal confidential information about the learning models through public queries. Differential privacy (DP) has been considered a promising technique to mitigate this attack. However, we find that the vulnerability persists when regression models are protected by current DP solutions. We show that an adversary can launch a query-flooding parameter duplication (QPD) attack to infer the model information via repeated queries. To defend against the QPD attack on logistic and linear regression models, we propose a novel High-Dimensional Gaussian (HDG) mechanism that prevents unauthorized information disclosure without interrupting the intended services. In contrast to prior work, the proposed HDG mechanism dynamically generates the privacy budget and random noise for different queries and their results to enhance the obfuscation. Moreover, for the first time, HDG enables an optimal privacy budget allocation that automatically determines the minimum amount of noise to be added on each dimension for a user-desired privacy level. We comprehensively evaluate the performance of HDG using real-world datasets and show that HDG effectively mitigates the QPD attack while satisfying the privacy requirements. We also plan to open-source the relevant code to the community for further research.
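To make the underlying idea concrete, the following is a minimal sketch of the *standard* (ε, δ)-DP Gaussian mechanism that perturbs a model's output vector independently on each dimension. This is not the paper's HDG mechanism (which additionally allocates per-query, per-dimension privacy budgets dynamically); the function name and parameters here are illustrative assumptions.

```python
import numpy as np

def gaussian_mechanism(values, epsilon, delta, l2_sensitivity):
    """Classic (epsilon, delta)-DP Gaussian mechanism.

    Adds i.i.d. Gaussian noise to every dimension of `values`,
    calibrated to the query's L2 sensitivity:
        sigma = sqrt(2 * ln(1.25 / delta)) * Delta_2 / epsilon
    """
    sigma = np.sqrt(2.0 * np.log(1.25 / delta)) * l2_sensitivity / epsilon
    noise = np.random.normal(0.0, sigma, size=np.shape(values))
    return np.asarray(values, dtype=float) + noise

# Example: perturb a regression model's prediction vector before release.
pred = np.array([0.42, 1.70, -0.30])
noisy_pred = gaussian_mechanism(pred, epsilon=1.0, delta=1e-5,
                                l2_sensitivity=0.1)
```

In contrast to this static calibration, HDG's contribution as described in the abstract is to vary ε and the injected noise across queries and dimensions, so that repeated (flooding) queries cannot average the noise away and duplicate the model parameters.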
