Article

Orthogonal Deep Neural Networks

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/TPAMI.2019.2948352

Keywords

Training; Robustness; Jacobian matrices; Task analysis; Neural networks; Optimization; Deep learning; Deep neural networks; generalization error; robustness; spectral regularization; image classification

Funding

  1. National Natural Science Foundation of China [61771201]
  2. Program for Guangdong Introducing Innovative and Entrepreneurial Teams [2017ZT07X183]
  3. Australian Research Council [DP180103424, DE190101473, FL-170100117]

Abstract

This paper introduces the algorithms of Orthogonal Deep Neural Networks (OrthDNNs) to improve generalization performance by connecting with recent interest in spectrally regularized deep learning methods. Theoretical analyses and experiments demonstrate that OrthDNNs can achieve local isometric properties on practical data distributions, leading to better optimization of network weights. Proposed algorithms, including strict and approximate OrthDNNs, along with the SVB and BBN methods, show effective and efficient performance in benchmark image classification tasks.
In this paper, we introduce the algorithms of Orthogonal Deep Neural Networks (OrthDNNs) to connect with recent interest in spectrally regularized deep learning methods. OrthDNNs are theoretically motivated by generalization analysis of modern DNNs, with the aim of finding solution properties of network weights that guarantee better generalization. To this end, we first prove that DNNs are locally isometric on data distributions of practical interest; by using a new covering of the sample space and introducing the local isometry property of DNNs into generalization analysis, we establish a new generalization error bound that is both scale- and range-sensitive to the singular value spectrum of each of the network's weight matrices. We prove that the optimal bound w.r.t. the degree of isometry is attained when each weight matrix has a spectrum of equal singular values, among which an orthogonal weight matrix, or a non-square one with orthonormal rows or columns, is the most straightforward choice, suggesting the algorithms of OrthDNNs. We present algorithms for both strict and approximate OrthDNNs, and for the latter we propose a simple yet effective algorithm called Singular Value Bounding (SVB), which performs as well as strict OrthDNNs but at a much lower computational cost. We also propose Bounded Batch Normalization (BBN) to make batch normalization compatible with OrthDNNs. We conduct extensive comparative studies using modern architectures on benchmark image classification. Experiments show the efficacy of OrthDNNs.
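The core of the SVB step described above can be sketched as a periodic projection of each weight matrix's singular values into a narrow band around one. The sketch below is a minimal illustration, not the paper's exact implementation: the function name `svb_project` and the band width `eps` are assumptions chosen for clarity.

```python
import numpy as np

def svb_project(W, eps=0.05):
    """Project matrix W so all singular values lie in [1/(1+eps), 1+eps].

    This keeps W close to (row/column) orthonormal, approximating the
    isometry property that motivates OrthDNNs.
    """
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    s = np.clip(s, 1.0 / (1.0 + eps), 1.0 + eps)
    return U @ np.diag(s) @ Vt

# Illustration: a random weight matrix has a wide singular spectrum;
# after projection the spectrum is bounded in the prescribed band.
rng = np.random.default_rng(0)
W = rng.normal(size=(64, 32))
W_svb = svb_project(W, eps=0.05)
s_after = np.linalg.svd(W_svb, compute_uv=False)
print(s_after.min(), s_after.max())  # both within [1/1.05, 1.05]
```

In training, such a projection would typically be applied to every weight matrix after every few SGD iterations, which is far cheaper than maintaining strict orthogonality throughout optimization.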

