4.6 Article

Parameter Conjugate Gradient with Secant Equation Based Elman Neural Network and its Convergence Analysis

Related References

Note: only some of the references are listed; download the original article for the complete reference information.
Article Operations Research & Management Science

Weak and strong convergence analysis of Elman neural networks via weight decay regularization

Li Zhou et al.

Summary: This paper proposes a novel weight-decay variant of the training algorithm to improve the generalization performance of Elman neural networks. By controlling weight growth, the method prevents over-fitting; rigorous theoretical analysis and experimental verification are provided.

OPTIMIZATION (2023)

Article Computer Science, Information Systems

Convergence analysis for sigma-pi-sigma neural network based on some relaxed conditions

Qinwei Fan et al.

Summary: The study establishes convergence results for the sigma-pi-sigma neural network under relaxed conditions.

INFORMATION SCIENCES (2022)

Article Computer Science, Artificial Intelligence

A modified conjugate gradient-based Elman neural network

Long Li et al.

Summary: An efficient conjugate gradient method is proposed in this paper for training an Elman recurrent network; it outperforms traditional gradient descent and conjugate gradient methods, as well as an evolutionary algorithm. The convergence of the new algorithm is also proven.

COGNITIVE SYSTEMS RESEARCH (2021)
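The conjugate gradient methods recurring throughout these references share one core update: a new search direction is formed as the negative gradient plus a coefficient beta times the previous direction. A minimal sketch on a quadratic objective, using the classical Fletcher-Reeves beta (not any specific cited variant; the matrix `A`, vector `b`, and function name are illustrative assumptions):

```python
import numpy as np

def cg_quadratic(A, b, w0, iters=10, tol=1e-10):
    """Nonlinear CG loop specialized to f(w) = 0.5 w^T A w - b^T w (A SPD).

    Illustrates the direction update d <- -g + beta * d shared by the
    CG-based training methods above; beta here is Fletcher-Reeves.
    """
    w = w0.astype(float)
    g = A @ w - b                          # gradient of the quadratic
    d = -g                                 # first direction: steepest descent
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ A @ d)     # exact line search on a quadratic
        w = w + alpha * d
        g_new = A @ w - b
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d              # new conjugate direction
        g = g_new
    return w

# Illustrative 2-D problem: the minimizer solves A w = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
w_star = cg_quadratic(A, b, np.zeros(2))
```

In neural-network training, the exact line search is replaced by an Armijo or Wolfe condition, and the secant-equation-based methods cited here modify how beta is computed.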

Article Computer Science, Information Systems

Deterministic convergence analysis via smoothing group Lasso regularization and adaptive momentum for Sigma-Pi-Sigma neural network

Qian Kang et al.

Summary: The paper proposes a sparse, accelerated training method for the Sigma-Pi-Sigma neural network, using smoothing group Lasso regularization and adaptive momentum to sparsify the network structure and speed up learning convergence. Theoretical analysis and numerical experiments demonstrate the effectiveness of the new algorithm.

INFORMATION SCIENCES (2021)

Article Computer Science, Information Systems

Convergence of a Gradient-Based Learning Algorithm With Penalty for Ridge Polynomial Neural Networks

Qinwei Fan et al.

Summary: This study introduces a regularization model with a Group Lasso penalty to improve the convergence speed and generalization ability of the network. A smoothing function approximating the Group Lasso penalty overcomes numerical oscillation and the difficulties the non-smooth penalty poses for convergence analysis. The efficiency of the proposed algorithm is demonstrated through numerical experiments, comparisons with various regularizers, and theoretical analysis.

IEEE ACCESS (2021)
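Two of the entries above smooth the group-Lasso penalty so it becomes differentiable at zero. A common smoothing (a generic sketch, not the exact function from either cited paper; the epsilon value and group layout are assumptions) replaces each group norm with sqrt(||w_g||^2 + eps^2):

```python
import numpy as np

def penalty(w, groups, eps=1e-4):
    """Smoothed group-Lasso penalty: sum over groups of sqrt(||w_g||^2 + eps^2).

    As eps -> 0 this approaches the usual group Lasso, but it stays
    differentiable at w_g = 0, which is what enables gradient-based
    training and deterministic convergence analysis.
    """
    return sum(np.sqrt(w[g] @ w[g] + eps**2) for g in groups)

def penalty_grad(w, groups, eps=1e-4):
    """Gradient of the smoothed penalty: w_g / sqrt(||w_g||^2 + eps^2)."""
    out = np.zeros_like(w)
    for g in groups:
        out[g] = w[g] / np.sqrt(w[g] @ w[g] + eps**2)
    return out

# Illustrative weights split into two groups of two coordinates each.
w = np.array([1.0, 2.0, 3.0, 4.0])
groups = [[0, 1], [2, 3]]
p = penalty(w, groups)          # close to sqrt(5) + 5
g = penalty_grad(w, groups)
```

Adding `penalty_grad` to the loss gradient during training drives small groups toward zero, pruning whole blocks of weights at once rather than individual entries.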

Article Mathematics, Applied

Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search

Xianzhen Jiang et al.

JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS (2019)

Article Mathematics, Applied

A modified conjugate gradient method for monotone nonlinear equations with convex constraints

Aliyu Muhammed Awwal et al.

APPLIED NUMERICAL MATHEMATICS (2019)

Article Mathematics, Interdisciplinary Applications

PARAMETER PREDICTION OF HYDRAULIC FRACTURE FOR TIGHT RESERVOIR BASED ON MICRO-SEISMIC AND HISTORY MATCHING

Kai Zhang et al.

FRACTALS-COMPLEX GEOMETRY PATTERNS AND SCALING IN NATURE AND SOCIETY (2018)

Article Computer Science, Artificial Intelligence

A modified Elman neural network with a new learning rate scheme

Guanghua Ren et al.

NEUROCOMPUTING (2018)

Article Engineering, Industrial

An efficient-robust structural reliability method by adaptive finite-step length based on Armijo line search

Behrooz Keshtegar et al.

RELIABILITY ENGINEERING & SYSTEM SAFETY (2018)

Proceedings Paper Engineering, Electrical & Electronic

Analysis of Artificial Neural Network Backpropagation Using Conjugate Gradient Fletcher Reeves In The Predicting Process

Anjar Wanto et al.

INTERNATIONAL CONFERENCE ON INFORMATION AND COMMUNICATION TECHNOLOGY (ICONICT) (2017)

Article Mathematics, Applied

Some modified conjugate gradient methods for unconstrained optimization

Xuewu Du et al.

JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS (2016)

Article Mathematics, Applied

A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches

Mohd Rivaie et al.

APPLIED MATHEMATICS AND COMPUTATION (2015)

Article Mathematics, Applied

A new conjugate gradient algorithm for training neural networks based on a modified secant equation

Ioannis E. Livieris et al.

APPLIED MATHEMATICS AND COMPUTATION (2013)

Article Mathematics

Two Modified Hybrid Conjugate Gradient Methods Based on a Hybrid Secant Equation

Saman Babaie-Kafaki et al.

MATHEMATICAL MODELLING AND ANALYSIS (2013)

Article Mathematics, Applied

A NONLINEAR CONJUGATE GRADIENT ALGORITHM WITH AN OPTIMAL PROPERTY AND AN IMPROVED WOLFE LINE SEARCH

Yu-Hong Dai et al.

SIAM JOURNAL ON OPTIMIZATION (2013)

Article Mathematics, Applied

A new class of nonlinear conjugate gradient coefficients with global convergence properties

Mohd Rivaie et al.

APPLIED MATHEMATICS AND COMPUTATION (2012)

Proceedings Paper Physics, Applied

A New Class of Conjugate Gradient Coefficient with Global Convergence Properties

Mohd Rivaie et al.

INTERNATIONAL CONFERENCE ON FUNDAMENTAL AND APPLIED SCIENCES 2012 (ICFAS2012) (2012)

Article Computer Science, Artificial Intelligence

Deterministic convergence of conjugate gradient method for feedforward neural networks

Jian Wang et al.

NEUROCOMPUTING (2011)

Article Mathematics, Applied

Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations

JZ Zhang et al.

JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS (2001)