Article

Linearized sigmoidal activation: A novel activation function with tractable non-linear characteristics to boost representation capability

Journal

EXPERT SYSTEMS WITH APPLICATIONS
Volume 120, Pages 346-356

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.eswa.2018.11.042

Keywords

Deep learning; Activation function; Non-linear activation; Adaptive activation; Convolutional neural networks


Activation functions play a crucial role in the discriminative capability of deep neural networks and are one of the main reasons for the revival of neural networks. Although recent activation functions address the vanishing and exploding gradient problems, they lack sufficient capacity to model non-linear data. This paper proposes a novel activation function that imparts neural networks with the capability to model non-linear dependencies in data. The proposed activation function remains non-saturating despite its non-linear structure, and it provides distinct activation behaviors to different segments of the data range. The paper presents two variants of the proposed function. The first, the linear sigmoidal activation function, is a fixed-structure activation function with coefficients defined at the start of model design. The second, the adaptive linear sigmoidal activation function, is a trainable function that can adapt itself to the complexity of the given data. Both proposed models are tested against state-of-the-art activation functions on benchmark datasets (CIFAR-10, MNIST, SVHN, FER-2013). The proposed activation function outperforms all compared activation functions in every test. (C) 2018 Elsevier Ltd. All rights reserved.
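The abstract describes an activation that is sigmoid-like yet non-saturating, with distinct behaviors over different input segments. The paper's actual coefficients and segment boundaries are not given in this abstract, so the sketch below is only a hypothetical illustration of that general idea: a standard sigmoid in a central segment, with linear tails of small non-zero slope outside it so that gradients never vanish. The names `linear_sigmoidal`, `tail_slope`, and `c` are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def sigmoid(x):
    """Standard logistic sigmoid."""
    return 1.0 / (1.0 + np.exp(-x))

def linear_sigmoidal(x, tail_slope=0.01, c=3.0):
    """Hypothetical piecewise sketch of a non-saturating, sigmoid-like
    activation. NOT the paper's exact formulation: coefficients and the
    segment boundary `c` are placeholders for illustration only.

    - central segment [-c, c]: ordinary sigmoid
    - tails: linear with a small non-zero slope, joined continuously,
      so the output keeps changing (and gradients stay non-zero) for
      arbitrarily large |x|
    """
    lower = sigmoid(-c) + tail_slope * (x + c)  # segment x < -c
    upper = sigmoid(c) + tail_slope * (x - c)   # segment x > c
    return np.where(x < -c, lower, np.where(x > c, upper, sigmoid(x)))
```

Because the tails are linear rather than flat, the derivative in the outer segments is `tail_slope` instead of the near-zero gradient of a saturated sigmoid, which is one plausible way to realize the non-saturating behavior the abstract claims.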

