Article

Recurrent least squares support vector machines

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/81.855471

Keywords

double scroll; radial basis functions; recurrent neural networks; support vector machines

Abstract

The method of support vector machines (SVM's) has been developed for solving classification and static function approximation problems. In this paper we introduce SVM's within the context of recurrent neural networks. Instead of Vapnik's epsilon insensitive loss function, we consider a least squares version related to a cost function with equality constraints for a recurrent network. Essential features of SVM's remain, such as Mercer's condition and the fact that the output weights are a Lagrange multiplier weighted sum of the data points. The solution to recurrent least squares SVM's (LS-SVM's) is characterized by a set of nonlinear equations. Due to its high computational complexity, we focus on a limited case of assigning the squared error an infinitely large penalty factor, with early stopping as a form of regularization. The effectiveness of the approach is demonstrated on trajectory learning of the double scroll attractor in Chua's circuit.
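For context, the static LS-SVM that this recurrent version extends admits a compact closed form. The sketch below is the standard least squares formulation with equality constraints, written in common LS-SVM notation with regularization constant \gamma and kernel K; the symbols are illustrative and not copied from this paper:

\min_{w,\,b,\,e} \; \frac{1}{2} w^{T} w + \frac{\gamma}{2} \sum_{i=1}^{N} e_i^{2}
\quad \text{subject to} \quad y_i = w^{T} \varphi(x_i) + b + e_i, \quad i = 1, \dots, N.

Eliminating w and e from the Karush-Kuhn-Tucker conditions leaves a linear system in the bias b and the Lagrange multipliers \alpha:

\begin{bmatrix} 0 & \mathbf{1}^{T} \\ \mathbf{1} & \Omega + \gamma^{-1} I \end{bmatrix}
\begin{bmatrix} b \\ \alpha \end{bmatrix}
=
\begin{bmatrix} 0 \\ y \end{bmatrix},
\qquad \Omega_{ij} = \varphi(x_i)^{T} \varphi(x_j) = K(x_i, x_j),

where K must satisfy Mercer's condition, and the resulting model \hat{y}(x) = \sum_{i} \alpha_i K(x, x_i) + b is precisely the Lagrange multiplier weighted sum of the data points mentioned in the abstract. In the recurrent setting the equality constraints feed previous outputs back into \varphi, so the corresponding optimality conditions become a set of nonlinear equations rather than this linear system.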

