Article

An Efficient Asymmetric Nonlinear Activation Function for Deep Neural Networks

Journal

SYMMETRY-BASEL
Volume 14, Issue 5, Pages -

Publisher

MDPI
DOI: 10.3390/sym14051027

Keywords

neural network; activation function; asymmetry; self-regularized; non-monotonic; backbone network

Funding

  1. University of Nottingham NRG grant [I03211200008]
  2. National Natural Science Foundation of China [72071116]
  3. Ningbo Municipal Bureau of Science and Technology [2019B10026]

Abstract

As the key step that endows a neural network with nonlinearity, the activation function is crucial to the performance of the network. This paper proposes an Efficient Asymmetric Nonlinear Activation Function (EANAF) for deep neural networks. Compared with existing activation functions, the proposed EANAF requires less computational effort, and it is self-regularized, asymmetric and non-monotonic. These desirable characteristics underpin the strong performance of the proposed EANAF. To demonstrate the effectiveness of this function in the field of object detection, the proposed activation function is compared with several state-of-the-art activation functions on typical backbone networks such as ResNet and CSPDarkNet. The experimental results demonstrate the superior performance of the proposed EANAF.
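The abstract describes the evaluation setup (swapping the proposed activation into standard backbones such as ResNet) but does not reproduce the EANAF formula itself. The minimal PyTorch sketch below illustrates only that swap pattern, not the paper's method: nn.SiLU (Swish) is used purely as a stand-in for EANAF, and replace_activation is an illustrative helper name, not code from the paper.

    import torch
    import torch.nn as nn
    from torchvision.models import resnet18

    def replace_activation(module, new_act):
        # Recursively swap every nn.ReLU in the backbone for an instance of new_act.
        for name, child in module.named_children():
            if isinstance(child, nn.ReLU):
                setattr(module, name, new_act())
            else:
                replace_activation(child, new_act)

    if __name__ == "__main__":
        backbone = resnet18(weights=None)        # untrained ResNet-18 backbone
        replace_activation(backbone, nn.SiLU)    # SiLU stands in for the custom activation
        x = torch.randn(1, 3, 224, 224)          # dummy image batch
        print(backbone(x).shape)                 # torch.Size([1, 1000])

Replacing the activation at the module level, as sketched above, leaves the rest of the backbone untouched, which is why this kind of comparison across ReLU-style alternatives can be run on the same architecture and training schedule.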
