Journal
AUTOMATICA
Volume 43, Issue 1, Pages 1-14
Publisher
PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.automatica.2006.07.024
Keywords
recursive identification; parameter estimation; stochastic gradient; convergence properties; forgetting factors; stochastic processes
Abstract
It is well known that the stochastic gradient (SG) identification algorithm has a poor convergence rate. To improve the convergence rate, we extend the SG algorithm from the viewpoint of innovation modification and present multi-innovation gradient type identification algorithms, including a multi-innovation stochastic gradient (MISG) algorithm and a multi-innovation forgetting gradient (MIFG) algorithm. Because the multi-innovation gradient type algorithms use not only the current data but also the past data at each iteration, parameter estimation accuracy can be improved. Finally, the performance analysis and simulation results show that the proposed MISG and MIFG algorithms have faster convergence rates and better tracking performance than the corresponding SG algorithms. (c) 2006 Elsevier Ltd. All rights reserved.
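As a rough sketch of the idea the abstract describes (not the paper's exact notation or conditions), a multi-innovation gradient update for a linear-in-parameters model replaces the scalar innovation of the SG algorithm with a vector of the p most recent prediction errors. The system, noise level, and all constants below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear-in-parameters system y(t) = phi(t)' theta + v(t);
# the true parameters and dimensions below are illustrative only.
n, dim, p = 2000, 2, 5            # samples, parameter dimension, innovation length p
theta = np.array([0.8, -0.5])     # true parameters to be identified
Phi = rng.standard_normal((n, dim))
y = Phi @ theta + 0.1 * rng.standard_normal(n)

theta_hat = np.zeros(dim)         # running parameter estimate
r = 10.0                          # step-size normalizer, started away from 0 to damp early updates
for t in range(p - 1, n):
    # Stack the p most recent regressors and outputs (the "multi-innovation" window)
    Phi_p = Phi[t - p + 1:t + 1].T            # dim x p matrix of past regressors
    Y_p = y[t - p + 1:t + 1]                  # last p outputs
    E_p = Y_p - Phi_p.T @ theta_hat           # p-dimensional innovation vector
    r += Phi[t] @ Phi[t]                      # accumulate ||phi(t)||^2
    theta_hat = theta_hat + (Phi_p @ E_p) / r # MISG-style gradient update

print(theta_hat)
```

With p = 1 this reduces to the ordinary SG update; a larger p reuses past data at every step, which is the mechanism the abstract credits for the improved convergence rate and estimation accuracy.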