Article

Differential privacy model for blockchain based smart home architecture

Publisher

ELSEVIER
DOI: 10.1016/j.future.2023.08.010

Keywords

Smart home; Blockchain; Differential privacy; Membership inference attack; Internet of Things; Edge computing


Secure and private communication using the Internet of Things (IoT) poses several challenges for smart home systems. In particular, data collected from IoT devices comprise sensitive personal information such as biomedical, financial, location, and activity data. Recent research looks into the use of blockchain in smart home systems to protect the privacy of data in use. Such solutions need to address privacy using a formal, mathematical model of data privacy, owing to the vulnerabilities associated with privacy-preserving blockchain networks. In this paper, our approach provides a privacy-preserving data aggregation mechanism for smart homes that agree to contribute their data to a cloud server, where machine learning is used to improve services for home users. We propose the use of differential privacy, a powerful concept in privacy-preserving schemes that provides formal assurances about how much information is leaked, quantified by a privacy budget. The main purpose of such a privacy-preserving scheme is to limit what can be inferred about individual training data from the model. Our techniques use a Rényi differential privacy (RDP) machine learning scheme based on a variant of stochastic gradient descent. The performance of the proposed framework is evaluated on three public datasets: UNSW-NB15, NSL-KDD, and ToN-IoT. Our findings show that differentially private models can protect against attackers at the cost of a substantial amount of model utility. We therefore propose an empirical value of epsilon that optimally balances utility and privacy for the current smart home scenario datasets. (c) 2023 Elsevier B.V. All rights reserved.
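The abstract describes training with a differentially private variant of stochastic gradient descent, where per-example gradients are clipped and Gaussian noise calibrated to a privacy budget is added before the update. The sketch below illustrates one such DP-SGD step in plain NumPy; it is a generic illustration under stated assumptions, not the authors' implementation, and the function name and parameter values (`clip_norm`, `noise_multiplier`, `lr`) are illustrative.

```python
import numpy as np

def dp_sgd_step(weights, per_example_grads, clip_norm=1.0,
                noise_multiplier=1.1, lr=0.1, rng=None):
    """One differentially private SGD step (generic DP-SGD sketch).

    Each per-example gradient is clipped to L2 norm `clip_norm`,
    the clipped gradients are averaged, and Gaussian noise with
    standard deviation noise_multiplier * clip_norm / batch_size
    is added before applying the update.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    batch_size = len(per_example_grads)

    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale down any gradient whose norm exceeds clip_norm.
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))

    avg_grad = np.stack(clipped).mean(axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm / batch_size,
                       size=avg_grad.shape)
    return weights - lr * (avg_grad + noise)
```

The privacy budget (epsilon) then follows from `noise_multiplier`, the sampling rate, and the number of steps via an RDP accountant; larger noise gives a smaller epsilon but, as the paper reports, reduces model utility.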

