4.5 Article

An Efficient Asymmetric Nonlinear Activation Function for Deep Neural Networks

Related References

Note: only a subset of the related references is listed here; see the original source for the complete list.
Article Computer Science, Artificial Intelligence

Discovering Parametric Activation Functions

Garrett Bingham et al.

Summary: This paper proposes a technique for automatically customizing activation functions, yielding reliable performance improvements. It discovers both general-purpose activation functions and functions specialized to particular network architectures, consistently improving accuracy over ReLU and other activation functions (see the sketch after this entry).

NEURAL NETWORKS (2022)
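
The idea is an activation function whose shape is controlled by parameters trained by gradient descent together with the network weights. Below is a minimal PyTorch sketch of that idea; the Swish-like parametric form and the class name are illustrative assumptions, not the functions actually discovered in the paper.

    import torch
    import torch.nn as nn

    class ParametricActivation(nn.Module):
        """A parametric activation: alpha and beta are trained with the network.
        The Swish-like form below is an illustrative assumption."""
        def __init__(self):
            super().__init__()
            self.alpha = nn.Parameter(torch.tensor(1.0))
            self.beta = nn.Parameter(torch.tensor(1.0))

        def forward(self, x):
            # f(x) = alpha * x * sigmoid(beta * x); recovers Swish at alpha = 1
            return self.alpha * x * torch.sigmoid(self.beta * x)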

Article Engineering, Electrical & Electronic

Parametric rectified nonlinear unit (PRenu) for convolution neural networks

Ilyas El Jaafari et al.

Summary: The proposed parametric rectified non-linear unit (PRenu) resembles ReLU but differs by applying a non-linear transformation to positive inputs, which improves CNN convergence speed and accuracy (see the sketch after this entry).

SIGNAL IMAGE AND VIDEO PROCESSING (2021)
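
The exact PRenu formula is given in the paper; as a hedged illustration, the sketch below assumes a positive branch of the form x - alpha * log(1 + x), i.e. a rectifier that is non-linear (rather than identity) on positive inputs, with alpha learned.

    import torch
    import torch.nn as nn

    class PRenuSketch(nn.Module):
        """Rectifier with a non-linear positive branch (assumed form):
        f(x) = x - alpha * log(1 + x) for x >= 0, and 0 otherwise."""
        def __init__(self, alpha: float = 0.5):
            super().__init__()
            self.alpha = nn.Parameter(torch.tensor(alpha))

        def forward(self, x):
            pos = torch.clamp(x, min=0.0)  # zero out negative inputs
            return pos - self.alpha * torch.log1p(pos)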

Review Computer Science, Artificial Intelligence

A survey on modern trainable activation functions

Andrea Apicella et al.

Summary: In recent years there has been renewed interest in trainable activation functions, whose shape is learned during training to improve neural network performance. Many of the trainable activation functions proposed in the literature turn out to be equivalent to adding extra neuron layers that use fixed activation functions and simple local rules (see the sketch after this entry).

NEURAL NETWORKS (2021)
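
A common pattern covered by such surveys is a trainable activation built as a learned combination of fixed activation functions, which is exactly why many of these models reduce to extra layers of fixed-activation neurons. A minimal sketch under that assumption (the choice of basis functions here is illustrative):

    import torch
    import torch.nn as nn

    class LearnedCombinationActivation(nn.Module):
        """Trainable activation: a learned weighted sum of fixed activations,
        equivalent to a small extra layer of fixed-activation neurons."""
        def __init__(self):
            super().__init__()
            self.w = nn.Parameter(torch.ones(3) / 3)  # one weight per basis function

        def forward(self, x):
            return (self.w[0] * torch.relu(x)
                    + self.w[1] * torch.tanh(x)
                    + self.w[2] * torch.sigmoid(x))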

Article Computer Science, Artificial Intelligence

Focal Loss for Dense Object Detection

Tsung-Yi Lin et al.

IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE (2020)
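
For context, the focal loss introduced in this paper down-weights well-classified examples: FL(p_t) = -alpha_t * (1 - p_t)^gamma * log(p_t), where p_t is the predicted probability of the true class and gamma >= 0 controls how strongly training focuses on hard examples. A minimal binary-classification sketch (the function name is ours):

    import torch
    import torch.nn.functional as F

    def binary_focal_loss(logits, targets, alpha=0.25, gamma=2.0):
        """FL(p_t) = -alpha_t * (1 - p_t)^gamma * log(p_t)."""
        targets = targets.float()
        p = torch.sigmoid(logits)
        p_t = p * targets + (1 - p) * (1 - targets)              # prob. of true class
        alpha_t = alpha * targets + (1 - alpha) * (1 - targets)
        ce = F.binary_cross_entropy_with_logits(logits, targets,
                                                reduction='none')  # = -log(p_t)
        return (alpha_t * (1 - p_t) ** gamma * ce).mean()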

Proceedings Paper Energy & Fuels

Space Charge Analysis of Polyethylene with Chemical Defects Based on Density Functional Theory

Tao Lin et al.

2018 IEEE INTERNATIONAL CONFERENCE ON HIGH VOLTAGE ENGINEERING AND APPLICATION (ICHVE) (2018)