Article

A novel system architecture for secure authentication and data sharing in cloud enabled Big Data Environment

Publisher

ELSEVIER
DOI: 10.1016/j.jksuci.2020.05.005

Keywords

Big data outsourcing; Big data sharing; Big data management; SALSA encryption with MapReduce; Fractal index tree; SHA-3


This paper presents a solution for addressing the challenges of big data security in cloud computing. It introduces a novel system architecture called SADS-Cloud, which includes three processes: big data outsourcing, big data sharing, and big data management. The solution utilizes various encryption algorithms and data organization methods to ensure the security of big data in the cloud environment.
With the rapid growth of data sources, Big Data security in the Cloud is a major challenge. Several issues have arisen in the area of Big Data security, such as infrastructure security, data privacy, data management, and data integrity. Currently, Big Data processing, analytics, and storage are secured using cryptographic algorithms that are not well suited to protecting Big Data in the Cloud. In this paper, we present a solution that addresses the main issues of Big Data security over the Cloud. We propose a novel system architecture called Secure Authentication and Data Sharing in Cloud (SADS-Cloud). Three processes are involved: (i) Big Data outsourcing, (ii) Big Data sharing, and (iii) Big Data management. In Big Data outsourcing, data owners are registered with a Trust Center using the SHA-3 hashing algorithm. The MapReduce model splits the input file into fixed-size blocks of data, and the SALSA20 encryption algorithm is applied to each block. In Big Data sharing, data users participate in secure file retrieval: a user's credentials (ID, password, secure ID, current timestamp, and email ID) are hashed and compared with the values stored in a database. In Big Data management, three processes are implemented to organize the data: compression using the Lempel-Ziv-Markov chain algorithm (LZMA), clustering using Density-Based Spatial Clustering of Applications with Noise (DBSCAN), and indexing using a Fractal Index Tree. The proposed scheme is implemented in Java, and its performance is evaluated on the following metrics: information loss, compression ratio, throughput, encryption time, and decryption time. (C) 2020 The Authors. Published by Elsevier B.V. on behalf of King Saud University.
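Both the registration and sharing steps hinge on SHA-3 hashing of user credentials. The sketch below shows this step in Java (the paper's stated implementation language) using the JDK's built-in SHA3-256 digest (Java 9+); the field values and the '|'-separated concatenation are illustrative assumptions, as the abstract does not specify the exact message format.

import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

// Sketch of the credential-hashing step: the Trust Center stores the
// SHA-3 digest of the owner's credentials at registration, and Big Data
// sharing re-hashes the presented credentials and compares digests.
public class CredentialHashSketch {

    static byte[] sha3(String id, String password, String secureId,
                       String timestamp, String email) throws Exception {
        // SHA3-256 ships with the JDK since Java 9.
        MessageDigest md = MessageDigest.getInstance("SHA3-256");
        // '|'-separated concatenation is an illustrative choice only.
        String msg = String.join("|", id, password, secureId, timestamp, email);
        return md.digest(msg.getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) throws Exception {
        byte[] stored = sha3("alice", "pw", "SID-42", "1590000000", "a@x.org");
        byte[] presented = sha3("alice", "pw", "SID-42", "1590000000", "a@x.org");
        // Constant-time digest comparison, as is standard practice.
        System.out.println(MessageDigest.isEqual(stored, presented)); // true
    }
}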
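For the outsourcing step, each fixed-size block produced by MapReduce is encrypted with SALSA20. The following per-block sketch uses the Salsa20Engine from the Bouncy Castle lightweight API; Bouncy Castle is an assumption (the paper does not name a crypto library), and the MapReduce block splitting is omitted.

import java.security.SecureRandom;
import org.bouncycastle.crypto.engines.Salsa20Engine;
import org.bouncycastle.crypto.params.KeyParameter;
import org.bouncycastle.crypto.params.ParametersWithIV;

// Per-block Salsa20 encryption as in the Big Data outsourcing step.
// Each fixed-size block produced by MapReduce would be passed through
// this routine; the splitting itself is not shown here.
public class BlockEncryptSketch {

    // Salsa20 is a stream cipher: the same call with the same key and
    // nonce decrypts, so the boolean flag only affects bookkeeping.
    static byte[] salsa20(byte[] key, byte[] nonce, byte[] block, boolean enc) {
        Salsa20Engine engine = new Salsa20Engine();
        engine.init(enc, new ParametersWithIV(new KeyParameter(key), nonce));
        byte[] out = new byte[block.length];
        engine.processBytes(block, 0, block.length, out, 0);
        return out;
    }

    public static void main(String[] args) {
        SecureRandom rnd = new SecureRandom();
        byte[] key = new byte[32];   // 256-bit key
        byte[] nonce = new byte[8];  // 64-bit nonce, unique per block
        rnd.nextBytes(key);
        rnd.nextBytes(nonce);

        byte[] block = "one fixed-size block of outsourced data".getBytes();
        byte[] ct = salsa20(key, nonce, block, true);
        byte[] pt = salsa20(key, nonce, ct, false);
        System.out.println(new String(pt));
    }
}

Because Salsa20 is a stream cipher, reusing a (key, nonce) pair across blocks would leak the XOR of their plaintexts, so a fresh nonce per block is essential.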
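In the management stage, data is clustered with DBSCAN before indexing. A minimal illustration using Apache Commons Math's DBSCANClusterer is given below; the library choice, the toy feature vectors, and the eps/minPts values are all assumptions, since the abstract does not state which features are clustered or how the parameters are tuned.

import java.util.Arrays;
import java.util.List;
import org.apache.commons.math3.ml.clustering.Cluster;
import org.apache.commons.math3.ml.clustering.DBSCANClusterer;
import org.apache.commons.math3.ml.clustering.DoublePoint;

// Density-based clustering of illustrative feature vectors standing in
// for the records organized in the Big Data management stage.
public class ClusterSketch {
    public static void main(String[] args) {
        List<DoublePoint> points = Arrays.asList(
            new DoublePoint(new double[]{1.0, 1.1}),
            new DoublePoint(new double[]{1.2, 0.9}),
            new DoublePoint(new double[]{8.0, 8.2}),
            new DoublePoint(new double[]{7.9, 8.1}),
            new DoublePoint(new double[]{50.0, 50.0})); // an outlier

        // eps = neighbourhood radius, minPts = density threshold.
        DBSCANClusterer<DoublePoint> dbscan = new DBSCANClusterer<>(1.0, 2);
        List<Cluster<DoublePoint>> clusters = dbscan.cluster(points);
        clusters.forEach(c -> System.out.println(c.getPoints()));
    }
}

DBSCAN leaves low-density points unclustered as noise, so outlier records are kept apart automatically rather than forced into the nearest cluster.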
