Article

Privacy-Preserving Boosting in the Local Setting

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TIFS.2021.3097822

Keywords

Boosting; Privacy; Machine learning algorithms; Differential privacy; Decision trees; Data models; Prediction algorithms; Ensemble learning; boosting; local differential privacy

This paper proposes a distributed privacy-preserving boosting algorithm that leverages Local Differential Privacy as a building block to ensure the privacy of participating data owners. Experiments show that the algorithm effectively boosts various classifiers while maintaining high utility.
In machine learning, boosting is one of the most popular methods for combining multiple base learners into a superior one. The well-known Boosted Decision Tree classifier has been widely adopted in data mining and pattern recognition. With the emerging challenges in privacy, the personal images, browsing history, and financial reports held by individuals and entities are increasingly likely to contain sensitive information. The privacy concern is intensified when data leaves its owners' hands and is used for further mining. Such privacy issues demand that machine learning algorithms be privacy-aware. Recently, Local Differential Privacy (LDP) has been proposed as an effective privacy protection approach, which allows data owners to perturb their data before any release. In this paper, we propose a distributed privacy-preserving boosting algorithm that can be applied to various types of classifiers. By adopting LDP as a building block, the proposed boosting algorithm leverages the aggregation of the perturbed data shares to build the base learner, which ensures that privacy is well preserved for the participating data owners. Our experiments demonstrate that the proposed algorithm effectively boosts various classifiers, and the boosted classifiers maintain high utility.
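The abstract's perturb-then-aggregate pattern is the core of LDP: each owner randomizes their own data locally, and the aggregator debiases the pooled noisy reports. The paper's specific mechanism is not given here; as a minimal illustrative sketch (not the authors' algorithm), the classic randomized-response mechanism for a single binary attribute looks like this:

```python
import math
import random

def perturb_bit(bit: int, epsilon: float) -> int:
    """Owner side: report the true bit with probability
    p = e^eps / (e^eps + 1), otherwise flip it.
    This satisfies epsilon-local differential privacy."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return bit if random.random() < p else 1 - bit

def estimate_fraction(reports: list[int], epsilon: float) -> float:
    """Aggregator side: debias the mean of the noisy reports to
    obtain an unbiased estimate of the true fraction of 1s."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    observed = sum(reports) / len(reports)
    return (observed - (1 - p)) / (2 * p - 1)

# Example: 1,000 owners, 70% of whom hold the value 1.
random.seed(0)
true_bits = [1] * 700 + [0] * 300
reports = [perturb_bit(b, epsilon=3.0) for b in true_bits]
estimate = estimate_fraction(reports, epsilon=3.0)  # close to 0.7
```

No single report reveals an owner's bit with certainty, yet the aggregate statistic, the kind of quantity a base learner can be built from, is recovered accurately once enough perturbed shares are pooled.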
