Article

Singular Values for ReLU Layers

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TNNLS.2019.2945113

Keywords

Neural networks; Tools; Harmonic analysis; Learning systems; Task analysis; Measurement; Gaussian mean width; n-width; neural networks; rectified linear unit (ReLU); singular values

Funding

  1. Deutsche Forschungsgemeinschaft (DFG) [GRK 2224 / pi3]

Abstract

Despite their prevalence in neural networks, we still lack a thorough theoretical characterization of rectified linear unit (ReLU) layers. This article aims to further our understanding of ReLU layers by studying how the ReLU activation function interacts with the linear component of the layer and what role this interaction plays in the network's success at its intended task. To this end, we introduce two new tools: ReLU singular values of operators and the Gaussian mean width of operators. By presenting, on the one hand, theoretical justifications, results, and interpretations of these two concepts and, on the other hand, numerical experiments in which the ReLU singular values and the Gaussian mean width are applied to trained neural networks, we hope to give a comprehensive, singular-value-centric view of ReLU layers. We find that ReLU singular values and the Gaussian mean width not only enable theoretical insights but also provide metrics that seem promising for practical applications. In particular, these measures can be used to distinguish between correctly and incorrectly classified data as it traverses the network. We conclude by introducing two tools based on our findings: double layers and harmonic pruning.
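For readers who want a concrete feel for a Gaussian-mean-width-style quantity, the sketch below is a minimal Monte Carlo illustration, not the paper's definition or algorithm. It assumes (our assumption, for illustration only) that the set of interest is the image of the input unit sphere under a single ReLU layer x ↦ ReLU(Wx), and it approximates the classical mean width w(K) = E_g sup_{y∈K} ⟨g, y⟩ by sampling finitely many sphere points and Gaussian directions; the function name `estimate_mean_width` and all parameters are hypothetical.

```python
# Illustrative Monte Carlo sketch (not the paper's method) of a
# Gaussian-mean-width-style quantity for a ReLU layer x -> ReLU(W x).
# Assumption (ours): the set K is the image of the input unit sphere
# under the layer, and both the supremum over K and the Gaussian
# expectation are approximated by finite sampling.

import numpy as np


def relu(z):
    return np.maximum(z, 0.0)


def estimate_mean_width(W, n_gaussians=200, n_sphere=2000, seed=0):
    """Estimate E_g [ sup_x <g, ReLU(W x)> ] over x on the unit sphere."""
    rng = np.random.default_rng(seed)
    m, n = W.shape

    # Sample points approximately uniformly on the unit sphere in R^n.
    X = rng.standard_normal((n_sphere, n))
    X /= np.linalg.norm(X, axis=1, keepdims=True)

    # Push the sampled points through the ReLU layer.
    Y = relu(X @ W.T)  # shape: (n_sphere, m)

    # For each Gaussian direction g, take the maximum correlation <g, y>
    # over the sampled image points, then average over directions.
    G = rng.standard_normal((n_gaussians, m))
    return float(np.mean(np.max(Y @ G.T, axis=0)))


if __name__ == "__main__":
    # Toy weight matrix with variance-scaled Gaussian entries.
    W = np.random.default_rng(1).standard_normal((64, 32)) / np.sqrt(32)
    print("estimated mean width:", estimate_mean_width(W))
```

Such a sampled estimate is crude (the supremum is only taken over finitely many sphere points), but it conveys the kind of layer-level geometric summary the abstract refers to when it describes tracking data as it traverses the network.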
