Article

Role of Layers and Neurons in Deep Learning With the Rectified Linear Unit

Journal

CUREUS JOURNAL OF MEDICAL SCIENCE
Volume 13, Issue 10, Pages -

Publisher

SPRINGERNATURE
DOI: 10.7759/cureus.18866

Keywords

rectified linear unit (ReLU); role of neurons; role of layers; three types of irises; canonical correlation analysis; curved surface discrimination; deep learning


Abstract

Deep learning is used to classify data into several groups based on nonlinear curved surfaces. This paper focuses on a theoretical analysis of deep learning with the rectified linear unit (ReLU) activation function. Because the layers approximate a nonlinear curved surface, increasing the number of layers improves the approximation accuracy of that surface. Neurons, by contrast, perform a layer-by-layer approximation of the most appropriate hyperplanes, so increasing their number does not improve the results obtained via canonical correlation analysis (CCA). These results illustrate the respective roles of layers and neurons in deep learning with ReLU.
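A minimal sketch of the layers-versus-neurons comparison described in the abstract, not the authors' actual experiment: it trains two ReLU networks, one wide and shallow and one narrow and deep, on synthetic points separated by a curved surface. The data set, architectures, and the use of scikit-learn's MLPClassifier are illustrative assumptions.

```python
# Illustrative sketch only: compare "more layers" vs. "more neurons" for a
# ReLU network on data separated by a nonlinear curved surface.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(4000, 2))
# Label points by a curved (parabola-like) decision surface.
y = (X[:, 1] > 0.5 * X[:, 0] ** 2 - 0.2).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Hypothetical architectures: same activation, different depth/width trade-off.
architectures = {
    "1 hidden layer, 64 neurons (wide)": (64,),
    "4 hidden layers, 8 neurons each (deep)": (8, 8, 8, 8),
}
for name, hidden in architectures.items():
    clf = MLPClassifier(hidden_layer_sizes=hidden, activation="relu",
                        max_iter=2000, random_state=0)
    clf.fit(X_tr, y_tr)
    print(f"{name}: test accuracy = {clf.score(X_te, y_te):.3f}")
```

Under the paper's argument, the deeper network should approximate the curved boundary at least as well as the wider one, since additional layers (rather than additional neurons per layer) refine the piecewise-linear approximation of the surface.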

