4.6 Article

Deep neural networks with Elastic Rectified Linear Units for object recognition

Journal

NEUROCOMPUTING
Volume 275, Pages 1132-1139

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2017.09.056

Keywords

Deep neural networks; Elastic Rectified Linear Unit (EReLU); Elastic Parametric Rectified Linear Unit (EPReLU); Non-saturating nonlinear activation function

Abstract

The Rectified Linear Unit (ReLU) is crucial to the recent success of deep neural networks (DNNs). In this paper, we propose a novel Elastic Rectified Linear Unit (EReLU) that focuses on processing the positive part of the input. Unlike previous variants of ReLU, which typically adopt linear or piecewise linear functions to represent the positive part, EReLU is characterized by each positive value scaling within a moderate range, like a spring, during the training stage. At test time, EReLU becomes the standard ReLU. EReLU improves model fitting with no extra parameters and little risk of overfitting. Furthermore, we propose the Elastic Parametric Rectified Linear Unit (EPReLU), which combines EReLU with the parametric ReLU (PReLU). EPReLU is able to further improve network performance. In addition, we present a new training strategy for training DNNs with EPReLU. Experiments on four benchmarks, CIFAR10, CIFAR100, SVHN and ImageNet 2012, demonstrate the effectiveness of both EReLU and EPReLU. (C) 2017 Elsevier B.V. All rights reserved.
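The following is a minimal illustrative sketch of what the abstract describes, not the authors' released implementation. It assumes the elastic factor is drawn uniformly from [1 - alpha, 1 + alpha] per positive activation during training (the abstract only states a "moderate range"), uses a single learnable negative slope for the PReLU part, and is written against the PyTorch API; the parameter names alpha and init_slope are ours.

```python
import torch
import torch.nn as nn

class EReLU(nn.Module):
    """Elastic ReLU sketch: during training each positive activation is
    scaled by a random factor near 1 ("like a spring"); at test time the
    unit reduces to the standard ReLU. The uniform sampling range
    [1 - alpha, 1 + alpha] is an assumption made for illustration."""
    def __init__(self, alpha: float = 0.1):
        super().__init__()
        self.alpha = alpha

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training:
            # Independent elastic factor for every element of the input.
            k = torch.empty_like(x).uniform_(1.0 - self.alpha, 1.0 + self.alpha)
            return torch.where(x > 0, k * x, torch.zeros_like(x))
        # Test-time behaviour: plain ReLU.
        return torch.relu(x)


class EPReLU(nn.Module):
    """Elastic Parametric ReLU sketch: elastic scaling on the positive part
    (as in EReLU) combined with a learnable PReLU-style slope on the
    negative part. A single scalar slope is assumed here for simplicity."""
    def __init__(self, alpha: float = 0.1, init_slope: float = 0.25):
        super().__init__()
        self.erelu = EReLU(alpha)
        self.slope = nn.Parameter(torch.tensor(init_slope))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        positive = self.erelu(torch.clamp(x, min=0.0))
        negative = self.slope * torch.clamp(x, max=0.0)
        return positive + negative
```

Calling model.train() or model.eval() on a network containing these modules toggles the elastic scaling on and off, mirroring the train/test distinction described in the abstract.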
