Article

Echo State Networks trained by Tikhonov least squares are L2(μ) approximators of ergodic dynamical systems

Journal

Physica D: Nonlinear Phenomena
Volume 421, Issue -, Pages -

Publisher

ELSEVIER
DOI: 10.1016/j.physd.2021.132882

Keywords

Reservoir computing; Liquid state machine; Time series analysis; Lorenz equations; Delay embedding; Recurrent neural networks

Funding

  1. EPSRC, UK Centre for Doctoral Training in Statistical Applied Mathematics at Bath (SAMBa) [EP/L015684/1]

Abstract

Echo State Networks (ESNs) are a class of single-layer recurrent neural networks with randomly generated internal weights and a single layer of tuneable outer weights, which are usually trained by regularised linear least squares regression. Remarkably, ESNs still enjoy the universal approximation property despite the training procedure being entirely linear. In this paper, we prove that an ESN trained on a sequence of observations from an ergodic dynamical system (with invariant measure μ) using Tikhonov least squares regression against a set of targets will approximate the target function in the L2(μ) norm. In the special case that the targets are future observations, the ESN is learning the next-step map, which allows time series forecasting. We demonstrate the theory numerically by training an ESN using Tikhonov least squares on a sequence of scalar observations of the Lorenz system.
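Because only the outer weights are trained, the whole fitting step described in the abstract reduces to one regularised linear solve, W = (X^T X + λ I)^{-1} X^T Y, where X stacks the reservoir states and Y the targets. The sketch below is not the authors' code; it is a minimal Python illustration of that pipeline, assuming a standard leak-free tanh reservoir: an ESN with fixed random internal weights is driven by the x-coordinate of a Lorenz trajectory, and its output weights are fitted by Tikhonov least squares against the next observation (the next-step map). Reservoir size, spectral radius, the Euler integrator, and the regularisation parameter are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (not the authors' code): an Echo State Network trained by
# Tikhonov (ridge-regularised) least squares on scalar Lorenz observations.
# All parameter values below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# --- Scalar observation sequence from the Lorenz system (x-coordinate) ---
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One explicit Euler step of the Lorenz equations (assumed integrator)."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

T = 5000
state = np.array([1.0, 1.0, 1.0])
obs = np.empty(T)
for t in range(T):
    state = lorenz_step(state)
    obs[t] = state[0]                     # scalar observation: x-coordinate

# --- ESN with randomly generated, fixed internal weights ---
N = 300                                   # reservoir dimension (assumption)
A = rng.normal(size=(N, N))               # random internal weight matrix
A *= 0.9 / np.max(np.abs(np.linalg.eigvals(A)))   # rescale spectral radius < 1
C = rng.uniform(-0.5, 0.5, size=N)        # input weights
b = rng.uniform(-0.5, 0.5, size=N)        # bias

def reservoir_states(u):
    """Drive the reservoir with the scalar input sequence u; collect states."""
    x = np.zeros(N)
    X = np.empty((len(u), N))
    for t, ut in enumerate(u):
        x = np.tanh(A @ x + C * ut + b)
        X[t] = x
    return X

washout = 200
X = reservoir_states(obs[:-1])            # states driven by u_0 .. u_{T-2}
targets = obs[1:]                         # next-step targets u_1 .. u_{T-1}
X, targets = X[washout:], targets[washout:]

# --- Tikhonov least squares for the single layer of outer weights ---
lam = 1e-6                                # regularisation parameter (assumption)
W = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ targets)

pred = X @ W                              # one-step predictions on training data
print("training RMSE:", np.sqrt(np.mean((pred - targets) ** 2)))
```

The internal weights A, C and bias b are generated once and never updated; only W is fitted, so the training procedure is entirely linear, which is exactly the setting of the paper's L2(μ) approximation result.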
