Article

Differentially private ensemble learning for classification

Journal

NEUROCOMPUTING
Volume 430, Pages 34-46

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2020.12.051

Keywords

Ensemble learning; Differential privacy; Bagging; Classifier selection; Classification

Funding

  1. National Natural Science Foundation of China [61672176, 61763003, 61941201]
  2. Research Fund of Guangxi Key Lab of Multi-source Information Mining Security [19-A-02-01]
  3. Innovation Project of Guangxi Graduate Education [XYCSZ2020072]
  4. Guangxi 1000-Plan of Training Middle-aged/Young Teachers in Higher Education Institutions, Guangxi Bagui Scholar Teams for Innovation and Research Project, Guangxi Talent Highland Project of Big Data Intelligence and Application, Guangxi Collaborative In

This study applies differential privacy techniques to ensemble learning, proposing an algorithm that balances privacy protection and prediction accuracy in classification.
Training machine learning models requires large amounts of data, which may contain sensitive personal information, so privacy-preserving machine learning has become a research hotspot. In this paper, differential privacy is applied to ensemble learning, a branch of machine learning, to prevent privacy leakage during classification. We propose a differentially private ensemble learning algorithm for classification that achieves privacy protection while preserving prediction accuracy. First, we adopt the Bag of Little Bootstraps technique and the Jaccard similarity coefficient to generate a set of training subsets, and construct corresponding differentially private base classifiers by adding a carefully chosen amount of perturbation noise under a privacy budget allocation strategy. Furthermore, to reduce the impact of the perturbation noise on prediction accuracy, an effective ensemble algorithm is proposed: base classifiers are selected according to criterion functions and assigned weights simultaneously, and the final classification result is obtained by a weighted voting scheme. Experiments on nine real data sets from the UCI Machine Learning Repository demonstrate that our differentially private ensemble classification algorithm achieves a better trade-off between privacy protection and prediction accuracy. (c) 2020 Elsevier B.V. All rights reserved.
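The pipeline the abstract outlines (subsampled training sets, noise-perturbed base classifiers under a shared privacy budget, and accuracy-weighted voting) can be sketched in miniature. The following is an illustrative toy, not the authors' algorithm: it stands in a trivial noisy-majority rule for the paper's base classifiers, uses a uniform privacy-budget split rather than the paper's allocation strategy, and plain bootstrap sampling rather than the Bag of Little Bootstraps with Jaccard-based selection. All function names and parameters here are hypothetical.

```python
import random
from collections import Counter

def laplace_noise(scale):
    # The difference of two i.i.d. Exponential(mean=scale) draws
    # is distributed as Laplace(0, scale).
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def noisy_majority(labels, epsilon):
    """Epsilon-DP majority label: each class count is a sensitivity-1
    counting query, so Laplace(1/epsilon) noise per count suffices."""
    counts = Counter(labels)
    noisy = {c: n + laplace_noise(1.0 / epsilon) for c, n in counts.items()}
    return max(noisy, key=noisy.get)

def dp_ensemble_predict(labels, n_classifiers=5, total_epsilon=1.0):
    """Toy DP ensemble: bootstrap subsamples -> DP base predictions
    (noisy-majority stand-ins; real base classifiers would use features)
    -> weighted vote. Uniform budget split keeps the total at
    total_epsilon by sequential composition, since bootstrap
    subsamples overlap."""
    eps_each = total_epsilon / n_classifiers
    n = len(labels)
    votes = Counter()
    for _ in range(n_classifiers):
        sub = [labels[random.randrange(n)] for _ in range(n)]  # bootstrap
        pred = noisy_majority(sub, eps_each)
        # Weight each base classifier by its agreement with its own
        # subsample, a crude stand-in for the paper's criterion functions.
        weight = sum(1 for y in sub if y == pred) / n
        votes[pred] += weight
    return max(votes, key=votes.get)
```

With a generous budget the noise is negligible and the ensemble recovers the true majority class; as `total_epsilon` shrinks, the per-classifier noise grows and predictions degrade, which is exactly the privacy/accuracy trade-off the paper studies.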

