Article

Improving deep neural networks with multi-layer maxout networks and a novel initialization method

Journal

NEUROCOMPUTING
Volume 278, Issue -, Pages 34-40

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2017.05.103

Keywords

Deep learning; Convolutional neural networks; Activation function; Image classification; Initialization

Funding

  1. National Natural Science Foundation of China [61372169, 61532018, 61471049]


To enhance the discriminability of convolutional neural networks (CNNs) and to facilitate their optimization, this paper investigates the activation function of a neural network and a corresponding initialization method. First, a trainable activation function with a multi-layer structure, named the Multi-layer Maxout Network (MMN), is proposed. MMN is a multi-layer structured maxout that inherits the advantages of both a non-saturating activation function and a trainable activation-function approximator. Second, a robust initialization method is derived specifically for the MMN activation, with a theoretical proof; it works for the maxout activation as well. This initialization method reduces internal covariate shift as signals propagate through layers and mitigates the so-called exploding/vanishing gradient problem, leading to a more efficient training procedure for deep neural networks. Experimental results show that the proposed model outperforms several state-of-the-art methods on three image-classification benchmark datasets (CIFAR-10, CIFAR-100 and ImageNet), and that the novel initialization method improves performance further. Furthermore, the influence of MMN in different hidden layers is analyzed, and a trade-off scheme between accuracy and computing resources is given. (C) 2017 Elsevier B.V. All rights reserved.
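The abstract does not spell out the MMN architecture or the initialization formula, so the PyTorch sketch below is only a plausible reading of the idea: a maxout unit (the elementwise max over k learned linear pieces) stacked two layers deep to act as a trainable activation, with a fan-in-scaled Gaussian initialization. The names Maxout and MMN, the piece count k, and the std = sqrt(2 / (fan_in * k)) scaling are illustrative assumptions, not the paper's exact method.

    import torch
    import torch.nn as nn

    class Maxout(nn.Module):
        """Maxout unit: elementwise max over k learned linear pieces."""
        def __init__(self, in_features, out_features, k=2):
            super().__init__()
            self.k = k
            self.linear = nn.Linear(in_features, out_features * k)
            # Assumed variance-aware init: the usual fan-in rule rescaled by
            # the number of pieces k (a guess, not the paper's derivation).
            nn.init.normal_(self.linear.weight,
                            std=(2.0 / (in_features * k)) ** 0.5)
            nn.init.zeros_(self.linear.bias)

        def forward(self, x):
            out = self.linear(x)                        # (..., out_features * k)
            out = out.view(*x.shape[:-1], -1, self.k)   # split into k pieces
            return out.max(dim=-1).values               # keep the max piece

    class MMN(nn.Module):
        """A sketch of a Multi-layer Maxout Network activation: two stacked
        maxout layers forming a trainable nonlinearity over a feature vector."""
        def __init__(self, features, hidden=None, k=2):
            super().__init__()
            hidden = hidden or features
            self.net = nn.Sequential(Maxout(features, hidden, k),
                                     Maxout(hidden, features, k))

        def forward(self, x):
            return self.net(x)

    # Usage: a drop-in trainable activation on 64-dimensional features.
    x = torch.randn(8, 64)
    y = MMN(64)(x)   # shape (8, 64)

Because each maxout piece is linear, stacking two maxout layers yields a piecewise-linear function with many more segments than a single maxout, which is one way to read the claim that MMN approximates a richer class of activations while staying non-saturating.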
