Article

A Comprehensive and Modularized Statistical Framework for Gradient Norm Equality in Deep Neural Networks

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/TPAMI.2020.3010201

Keywords

Deep neural networks; free probability; gradient norm equality

Funding

  1. National Science Foundation [1725447, 1817037, 1730309]
  2. National Natural Science Foundation of China [61876215]
  3. Beijing Academy of Artificial Intelligence (BAAI)
  4. Institute for Guo Qiang, Tsinghua University
  5. Key Scientific Technological Innovation Research Project by the Ministry of Education
  6. Zhejiang Laboratory
  7. Directorate for Computer & Information Science & Engineering [1725447, 1817037, 1730309] Funding Source: National Science Foundation
  8. Division of Computer and Network Systems [1730309] Funding Source: National Science Foundation
  9. Division of Computing and Communication Foundations [1817037, 1725447] Funding Source: National Science Foundation


This paper proposes a novel metric called Block Dynamical Isometry to measure the change of gradient norm in neural networks and presents a statistical framework based on free probability. Several existing methods are improved based on the analysis, and a new normalization technique is proposed. The conclusions and methods are validated by extensive experiments on multiple models.
The rapid development of deep neural networks (DNNs) in recent years can be largely attributed to techniques that address gradient explosion and vanishing. To understand the principles behind these techniques and to develop new methods, many metrics have been proposed for identifying networks that are free of gradient explosion and vanishing. However, due to the diversity of network components and the complex serial-parallel hybrid connections in modern DNNs, evaluating existing metrics usually requires strong assumptions or complex statistical analysis, or the metrics apply only to limited settings, which constrains their adoption in the community. In this paper, inspired by Gradient Norm Equality and dynamical isometry, we first propose a novel metric called Block Dynamical Isometry, which measures the change of gradient norm in individual blocks. Because Block Dynamical Isometry is norm-based, its evaluation requires weaker assumptions than the original dynamical isometry. To ease the otherwise challenging derivations, we propose a highly modularized statistical framework based on free probability. Our framework includes several key theorems that handle complex serial-parallel hybrid connections and a library that covers the diversity of network components; several sufficient conditions for its prerequisites are also provided. Powered by our metric and framework, we analyze a wide range of initialization techniques, normalization methods, and network structures, and we find that Block Dynamical Isometry is a universal principle behind them. We then improve several existing methods based on this analysis, including an activation function selection strategy for initialization techniques, a new configuration for weight normalization, a depth-aware way to derive the coefficients in SeLU, and initialization/weight normalization schemes for DenseNet. Moreover, we propose a novel normalization technique named second moment normalization, which incurs about 30 percent less computational overhead than batch normalization without accuracy loss and performs better under micro batch sizes. Finally, our conclusions and methods are validated by extensive experiments on multiple models over CIFAR-10 and ImageNet.
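
The abstract's central object, Block Dynamical Isometry, can be made concrete with a small numerical check. The sketch below is a minimal illustration rather than the paper's procedure: for a block with input-output Jacobian J, it evaluates the normalized-trace statistics of JJ^T, where a block behaves well when the first moment phi(JJ^T) is near 1 and the spectral variance phi((JJ^T)^2) - phi(JJ^T)^2 is near 0. The block size, sample, and initializations are illustrative choices of mine.

```python
# Minimal sketch of a Block Dynamical Isometry check on toy square Jacobians.
# phi(A) = tr(A) / n is the normalized trace used throughout the paper's framework.
import numpy as np

def phi(a: np.ndarray) -> float:
    """Normalized trace phi(A) = tr(A) / n."""
    return np.trace(a) / a.shape[0]

def bdi_stats(jacobian: np.ndarray) -> tuple:
    """Return (first moment, spectral variance) of J J^T."""
    jjt = jacobian @ jacobian.T
    m1 = phi(jjt)
    var = phi(jjt @ jjt) - m1 ** 2
    return m1, var

rng = np.random.default_rng(0)
n = 512

# Gaussian init with variance 1/n: phi(JJ^T) ~= 1, but the spectral
# variance stays well away from 0 (Marchenko-Pastur spectrum).
gaussian = rng.normal(0.0, np.sqrt(1.0 / n), size=(n, n))
print("gaussian  :", bdi_stats(gaussian))

# Orthogonal init: J J^T = I exactly, so phi = 1 and variance = 0.
q, _ = np.linalg.qr(rng.normal(size=(n, n)))
print("orthogonal:", bdi_stats(q))
```

In this linear toy case, orthogonal initialization attains the metric exactly while Gaussian initialization only matches the first moment, which is the kind of distinction the norm-based metric is designed to expose.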
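
The proposed second moment normalization can likewise be sketched. Assuming the core idea stated in the abstract, replacing batch normalization's mean/variance pair with the batch second moment E[x^2] alone (dropping the mean-subtraction pass is where the computation savings come from), a hypothetical PyTorch module might look as follows. The class name and hyperparameters are mine, and the sketch omits running statistics and the paper's accompanying weight centralization.

```python
# Hedged sketch of second-moment-style normalization; not the paper's exact method.
import torch
import torch.nn as nn

class SecondMomentNorm2d(nn.Module):
    def __init__(self, num_features: int, eps: float = 1e-5):
        super().__init__()
        self.eps = eps
        self.gamma = nn.Parameter(torch.ones(1, num_features, 1, 1))
        self.beta = nn.Parameter(torch.zeros(1, num_features, 1, 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Per-channel second moment over batch and spatial dims; unlike
        # batch norm, no separate mean pass is needed.
        second_moment = x.pow(2).mean(dim=(0, 2, 3), keepdim=True)
        return self.gamma * x * torch.rsqrt(second_moment + self.eps) + self.beta

x = torch.randn(8, 16, 32, 32)
y = SecondMomentNorm2d(16)(x)
print(y.shape)  # torch.Size([8, 16, 32, 32])
```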

