Article

Improved Recurrent Neural Networks for Text Classification and Dynamic Sylvester Equation Solving

Journal

NEURAL PROCESSING LETTERS
Volume -, Issue -, Pages -

Publisher

SPRINGER
DOI: 10.1007/s11063-023-11176-6

Keywords

Text classification; Novel activation functions; Recurrent neural networks; Dynamic Sylvester equation; Dynamic matrix inversion; Robot manipulator


This study proposes two novel activation functions (NAFs) to improve the performance of recurrent neural network (RNN) models in text classification and dynamic problem solving. The first NAF (NAF(1)) is applied to several RNN models for text classification, yielding higher accuracy than traditional activation functions. The second NAF (NAF(2)) is used to construct an improved fixed-time convergent RNN (IFTCRNN) model for solving time-varying problems, achieving fixed-time convergence and strong robustness to noise.
Text classification and time-varying problem solving are two basic practical problems frequently encountered in science and engineering, and both are commonly handled by recurrent neural networks (RNNs); improving the convergence and robustness of RNN models is therefore increasingly important. Based on this fact, two novel activation functions (NAFs) are proposed in this work to improve the performance of RNN models for text classification, dynamic problem solving and dynamic matrix inversion. First, the first NAF (NAF(1)) is applied to the two-layer simple RNN model, the long short-term memory (LSTM) RNN model and the gated recurrent unit (GRU) RNN model for text classification. Compared with the same three RNN models activated by previously reported activation functions (the rectified linear unit (ReLU), leaky ReLU (LReLU), exponential linear unit (ELU) and scaled ELU (SELU)), the NAF(1)-activated RNN models achieve higher accuracy in text classification. In addition, based on the second NAF (NAF(2)), an improved fixed-time convergent recurrent neural network (IFTCRNN) model is constructed for solving time-varying problems. The NAF(2)-based IFTCRNN model achieves fixed-time convergence and strong robustness to noise in time-varying Sylvester matrix equation solving, dynamic matrix inversion and robot manipulator trajectory tracking.
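
The abstract does not give the algebraic form of NAF(1), so the sketch below only illustrates the experimental setup it describes: plugging a custom activation into two-layer SimpleRNN, LSTM and GRU text classifiers (here in Keras style) and comparing it against the ReLU, LReLU, ELU and SELU baselines. The placeholder function naf1 is an assumption for illustration, not the authors' activation.

    import tensorflow as tf

    # Placeholder for NAF(1): its actual definition is in the paper, not the abstract,
    # so a generic smooth nonlinearity stands in here (assumption, illustration only).
    def naf1(x):
        return tf.tanh(x) + 0.1 * x

    def build_text_classifier(cell="lstm", activation=naf1,
                              vocab_size=20000, embed_dim=128, num_classes=2):
        rnn_layer = {"simple": tf.keras.layers.SimpleRNN,
                     "lstm": tf.keras.layers.LSTM,
                     "gru": tf.keras.layers.GRU}[cell]
        model = tf.keras.Sequential([
            tf.keras.layers.Embedding(vocab_size, embed_dim),
            # Two stacked recurrent layers, mirroring the two-layer setup described above.
            rnn_layer(64, activation=activation, return_sequences=True),
            rnn_layer(64, activation=activation),
            tf.keras.layers.Dense(num_classes, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model

    # Baseline activations compared against in the abstract.
    baselines = {"relu": tf.nn.relu, "lrelu": tf.nn.leaky_relu,
                 "elu": tf.nn.elu, "selu": tf.nn.selu}

Each model would then be trained on a tokenised text corpus and test accuracy compared across activations, which is the comparison the abstract reports.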
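
Likewise, the IFTCRNN dynamics and NAF(2) are defined only in the full paper. The sketch below shows a generic zeroing-neural-network (ZNN) style construction for the time-varying Sylvester equation A(t)X(t) + X(t)B(t) = C(t), the class of problem the model addresses: an error function E = AX + XB - C is driven to zero through an activated design formula dE/dt = -gamma * phi(E). The sign-bi-power map phi used here is a common fixed-time choice and only a stand-in for NAF(2).

    import numpy as np
    from scipy.linalg import solve_sylvester

    def phi(E, r1=0.5, r2=2.0):
        # Stand-in activation for NAF(2): a sign-bi-power map often used in
        # fixed-time ZNN designs (assumption, not the paper's function).
        return np.sign(E) * np.abs(E) ** r1 + np.sign(E) * np.abs(E) ** r2

    def znn_sylvester_step(A, B, C, dA, dB, dC, X, gamma=10.0):
        # One derivative evaluation for A(t) X + X B(t) = C(t).
        # Error function E = A X + X B - C; design formula dE/dt = -gamma * phi(E).
        E = A @ X + X @ B - C
        # Expanding dE/dt gives A dX + dX B = rhs, itself a Sylvester equation in dX.
        rhs = -gamma * phi(E) - dA @ X - X @ dB + dC
        return solve_sylvester(A, B, rhs)

    # Toy usage: a 2x2 time-varying problem integrated with forward Euler.
    def demo(T=1.0, dt=1e-3):
        X = np.zeros((2, 2))
        for k in range(int(T / dt)):
            t = k * dt
            A  = np.array([[2 + np.sin(t), 0.0], [0.0, 2 + np.cos(t)]])
            dA = np.array([[np.cos(t), 0.0], [0.0, -np.sin(t)]])
            B  = np.array([[3.0, 0.1], [0.0, 3.0]]); dB = np.zeros((2, 2))
            C  = np.array([[np.sin(t), 1.0], [0.0, np.cos(t)]])
            dC = np.array([[np.cos(t), 0.0], [0.0, -np.sin(t)]])
            X = X + dt * znn_sylvester_step(A, B, C, dA, dB, dC, X)
        return X

Dynamic matrix inversion fits the same template as a special case, with B(t) = 0 and C(t) = I so that the equation reduces to A(t)X(t) = I.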

