Journal
NEUROCOMPUTING
Volume 272, Issue -, Pages 204-212
Publisher
ELSEVIER
DOI: 10.1016/j.neucom.2017.06.070
Keywords
Convolutional neural networks; Activation function learning; Adaptive activation; Hierarchical activation
Funding
- Research Grants Council of the Hong Kong Special Administrative Region, China [CityU 11300715]
- City University of Hong Kong [7004674]
Abstract
Activation functions play important roles in deep convolutional neural networks. This work focuses on learning activation functions by combining basic activation functions in a data-driven way. We explore three strategies for learning activation functions, allowing the activation operation to adapt to its inputs. We first explore two strategies that combine basic activation functions linearly and nonlinearly, respectively. We then investigate a strategy in which basic activation functions are combined through hierarchical integration. Experiments demonstrate that the proposed activation functions achieve better performance than ReLU and its variants on benchmarks of various scales. (C) 2017 Elsevier B.V. All rights reserved.
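The abstract's first strategy, a data-driven linear combination of basic activation functions, can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not the paper's implementation: the choice of basis functions (ReLU, tanh, ELU), the softmax normalization of the mixing weights, and the function names are all hypothetical; in practice the logits would be trained by backpropagation alongside the network weights.

```python
import numpy as np

def basic_activations(x):
    """Evaluate a stack of basic activation functions elementwise.
    The basis set here (ReLU, tanh, ELU) is an illustrative choice."""
    return np.stack([
        np.maximum(x, 0.0),                    # ReLU
        np.tanh(x),                            # tanh
        np.where(x > 0, x, np.exp(x) - 1.0),   # ELU (alpha = 1)
    ])

def combined_activation(x, logits):
    """Linear combination of basic activations with softmax-normalized
    mixing weights; `logits` stands in for learnable parameters."""
    w = np.exp(logits - logits.max())          # numerically stable softmax
    w = w / w.sum()
    acts = basic_activations(np.asarray(x, dtype=float))
    return np.tensordot(w, acts, axes=1)       # weighted sum over the basis

# Example: logits biased toward the ReLU component
y = combined_activation(np.array([-1.0, 0.0, 2.0]), np.array([2.0, 0.0, 0.0]))
```

Because all three basis functions pass through the origin, the combined activation does so as well for any weights; the nonlinear and hierarchical strategies mentioned in the abstract would replace the fixed weighted sum with input-dependent or nested combinations.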