4.7 Article

Deep Neural Networks with Random Gaussian Weights: A Universal Classification Strategy?

Journal

IEEE Transactions on Signal Processing
Volume 64, Issue 13, Pages 3444-3457

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TSP.2016.2546221

Keywords

Artificial neural networks; computation theory; deep learning; learning systems

Funding

  1. NSF
  2. ONR
  3. NGA
  4. NSSEFF
  5. ARO
  6. European Research Council (ERC) Starting Grant [335491]
  7. NSF Division of Computing and Communication Foundations
  8. NSF Directorate for Computer & Information Science & Engineering [1318168]

Abstract

Three important properties of a classification machinery are that i) the system preserves the core information of the input data; ii) the training examples convey information about unseen data; and iii) the system treats points from different classes differently. In this paper, we show that these fundamental properties are satisfied by the architecture of deep neural networks. We formally prove that such networks with random Gaussian weights perform a distance-preserving embedding of the data, with a special treatment for in-class and out-of-class data, so that similar points at the input of the network are likely to have similar outputs. The theoretical analysis of deep networks presented here exploits tools from the compressed sensing and dictionary learning literature, thereby making a formal connection between these important topics. The derived results allow us to draw conclusions about the metric learning properties of the network and their relation to its structure, and to provide bounds on the size of the training set required for the training examples to faithfully represent the unseen data. The results are validated with state-of-the-art trained networks.
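
To make the distance-preservation claim concrete, the sketch below pushes a handful of points through a small untrained network with i.i.d. Gaussian weights and ReLU activations, and compares pairwise Euclidean distances at the input and at the output. This is an illustrative experiment, not code from the paper: the layer widths, the 2/m variance scaling of the weights, and the helper names (random_relu_network, forward, pairwise_distances) are assumptions made only for this example.

    import numpy as np

    rng = np.random.default_rng(0)

    def random_relu_network(dims):
        # One i.i.d. Gaussian weight matrix per layer; the 2/m variance scaling
        # (an assumption of this sketch) keeps ReLU outputs at roughly constant energy.
        return [rng.normal(0.0, np.sqrt(2.0 / m), size=(m, n))
                for n, m in zip(dims[:-1], dims[1:])]

    def forward(weights, x):
        # Untrained network: linear map followed by ReLU at every layer.
        for W in weights:
            x = np.maximum(W @ x, 0.0)
        return x

    def pairwise_distances(Z):
        # Euclidean distance between every pair of rows of Z.
        return np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)

    # A few points on the unit sphere in R^64, pushed through a depth-4 network.
    d_in, width, depth, n_points = 64, 256, 4, 20
    weights = random_relu_network([d_in] + [width] * depth)
    X = rng.normal(size=(n_points, d_in))
    X /= np.linalg.norm(X, axis=1, keepdims=True)
    Y = np.stack([forward(weights, x) for x in X])

    D_in = pairwise_distances(X)
    D_out = pairwise_distances(Y)
    off_diag = ~np.eye(n_points, dtype=bool)
    ratios = D_out[off_diag] / D_in[off_diag]
    print(f"output/input distance ratios: mean {ratios.mean():.3f}, std {ratios.std():.3f}")

With these (assumed) settings the printed ratios concentrate around a common, depth-dependent value: the random network shrinks all pairwise distances by a similar factor rather than scrambling them, which is the empirical face of the stable, angle-dependent embedding the paper analyzes.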

Authors

Raja Giryes, Guillermo Sapiro, Alex M. Bronstein
