Proceedings Paper

An Adaptive and Fast Convergent Approach to Differentially Private Deep Learning

Journal

IEEE INFOCOM 2020 - IEEE Conference on Computer Communications

Publisher

IEEE
DOI: 10.1109/infocom41043.2020.9155359

Keywords

crowdsourcing; information security and privacy; differential privacy; deep learning; adaptive; fast convergent

Funding

  1. National Key R&D Program of China [2018YFB1004704]
  2. National Natural Science Foundation of China [61902177, 61872082, 61472184]
  3. Natural Science Foundation of Jiangsu Province of China [BK20190298]
  4. Guangdong Leading Talent Program [2016LJ06D658]
  5. Jiangsu Innovation and Entrepreneurship (Shuangchuang) Program
  6. Nanyang Technological University (NTU) Startup Grant [M4082311.020]
  7. Alibaba-NTU Singapore Joint Research Institute (JRI) [M4062640.J4I]
  8. Singapore Ministry of Education Academic Research Fund Tier 1 [RG128/18, RG115/19]
  9. NTU-WASP Joint Project [M4082443.020]
  10. Singapore Ministry of Education Academic Research Fund Tier 2 [MOE2019-T2-1-176]

Abstract

With the advent of the big data era, deep learning has become a prevalent building block in a variety of machine learning and data mining tasks, such as signal processing, network modeling, and traffic analysis, to name a few. Massive amounts of crowdsourced user data play a crucial role in the success of deep learning models. However, it has been shown that user data may be inferred from trained neural models and thereby exposed to potential adversaries, which raises information security and privacy concerns. To address this issue, recent studies leverage differential privacy to design privacy-preserving deep learning algorithms. Albeit successful at privacy protection, differential privacy degrades the performance of neural models. In this paper, we develop ADADP, an adaptive and fast convergent learning algorithm with a provable privacy guarantee. ADADP significantly reduces the privacy cost by improving the convergence speed with an adaptive learning rate, and it mitigates the negative effect of differential privacy on model accuracy by introducing adaptive noise. The performance of ADADP is evaluated on real-world datasets. Experimental results show that it outperforms state-of-the-art differentially private approaches in terms of both privacy cost and model accuracy.
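To make the recipe described in the abstract concrete, the sketch below shows a generic differentially private gradient step with per-example clipping, Gaussian noise, and an AdaGrad-style adaptive learning rate. It is only an illustration of the general approach under assumed hyperparameter names (clip_norm, noise_multiplier, base_lr); it is not the authors' exact ADADP update, which additionally adapts the noise itself and comes with its own privacy accounting.

```python
import numpy as np

def dp_adaptive_sgd_step(theta, per_example_grads, accum, rng,
                         clip_norm=1.0, noise_multiplier=1.1,
                         base_lr=0.1, eps=1e-8):
    """One differentially private update with an adaptive learning rate.

    theta:              current parameter vector, shape (dim,)
    per_example_grads:  one gradient per example, shape (batch_size, dim)
    accum:              running sum of squared noisy gradients, shape (dim,)
    """
    batch_size, dim = per_example_grads.shape

    # 1. Clip each example's gradient to L2 norm at most clip_norm,
    #    bounding the sensitivity of the summed gradient.
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    clipped = per_example_grads * np.minimum(1.0, clip_norm / (norms + eps))

    # 2. Sum the clipped gradients and add Gaussian noise calibrated to
    #    the clipping norm (the Gaussian mechanism).
    noise = rng.normal(scale=noise_multiplier * clip_norm, size=dim)
    noisy_grad = (clipped.sum(axis=0) + noise) / batch_size

    # 3. Adaptive per-coordinate step size (AdaGrad-style accumulator):
    #    faster convergence means fewer noisy iterations, and hence a
    #    smaller total privacy cost for the same accuracy.
    accum = accum + noisy_grad ** 2
    step = base_lr * noisy_grad / (np.sqrt(accum) + eps)

    # 4. Gradient-descent update.
    return theta - step, accum


# Example usage on a toy least-squares problem.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(256, 10)), rng.normal(size=256)
theta, accum = np.zeros(10), np.zeros(10)
for _ in range(100):
    residual = X @ theta - y                # shape (256,)
    grads = residual[:, None] * X           # per-example gradients
    theta, accum = dp_adaptive_sgd_step(theta, grads, accum, rng)
```

In practice the cumulative privacy cost of such noisy iterations is tracked with a composition accountant (e.g., a moments/RDP accountant), and the paper's contribution is to adapt both the learning rate and the injected noise so that the model reaches a given accuracy within a smaller privacy budget.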

