Article

Data-informed reservoir computing for efficient time-series prediction

Related references

Note: only a partial list of references is shown; download the full text to obtain the complete reference information.
Article Computer Science, Artificial Intelligence

Limitations of the Recall Capabilities in Delay-Based Reservoir Computing Systems

Felix Koester et al.

Summary: This paper analyzes the memory capacity of a delay-based reservoir computer that uses a Hopf normal form as the nonlinearity and calculates its linear as well as higher-order recall capabilities. The results show that the total memory capacity depends on the ratio between the information input period and the time delay of the system.

COGNITIVE COMPUTATION (2023)
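
The linear memory capacity discussed in this entry is a task-independent benchmark that can be estimated for any reservoir. Below is a minimal Python sketch of the standard estimator, assuming a state matrix `states` (one row per time step) driven by a scalar input `u` is already available; it does not reproduce the Hopf-normal-form reservoir of the paper, and the delay range, washout, and ridge parameter are illustrative choices.

```python
import numpy as np

def linear_memory_capacity(states, u, max_delay=30, washout=100, ridge=1e-8):
    """Estimate the linear memory capacity of a reservoir.

    states : (T, N) array of reservoir states driven by the input u
    u      : (T,) array of the scalar input sequence
    The capacity for delay k is the squared correlation between the delayed
    input u(t-k) and its best (ridge-regularised) linear reconstruction
    from states(t); the total capacity is the sum over delays.
    """
    states, u = np.asarray(states), np.asarray(u)
    total = 0.0
    for k in range(1, max_delay + 1):
        X = states[washout:, :]            # states after transients
        y = u[washout - k : len(u) - k]    # input delayed by k steps
        # ridge readout: W = (X^T X + ridge*I)^(-1) X^T y
        W = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ y)
        y_hat = X @ W
        total += np.corrcoef(y_hat, y)[0, 1] ** 2   # capacity for this delay
    return total
```

The per-delay capacities also give the recall profile over k; summing them yields the total memory capacity referred to in the summary.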

Article Mathematics, Applied

Effect of temporal resolution on the reproduction of chaotic dynamics via reservoir computing

Kohei Tsuchiyama et al.

Summary: Reservoir computing is a machine learning paradigm that utilizes a reservoir structure with nonlinearities and short-term memory. It has expanded to various functions, including autonomous generation of chaotic time series, time series prediction, and classification. Sampling plays a crucial role in physical reservoir computers, and finding a suitable sampling frequency is essential for effectively regenerating chaotic time series.

Article Mathematics, Applied

Scale dependence of fractal dimension in deterministic and stochastic Lorenz-63 systems

T. Alberti et al.

Summary: Natural systems exhibit emergent phenomena at different scales, with chaotic behavior at large scales and randomness at small scales. The properties of the underlying attractor, which hosts the system trajectories, are usually studied quantitatively to understand these features. However, the multi-scale nature of natural systems makes it difficult to obtain a clear picture of the attracting set. In this study, we use an adaptive decomposition method and extreme value theory to analyze the scale-dependent dimension of the attractor, showing that it can discriminate between different types of noise.

Article Nanoscience & Nanotechnology

Deriving task specific performance from the information processing capacity of a reservoir computer

Tobias Huelser et al.

Summary: This article investigates the relationship between information processing capacity and task performance and finds only a poor correlation between the two. A new method for calculating the task mean square error is proposed; the predicted and actual errors agree well as long as the task input sequences do not have long autocorrelation times.

NANOPHOTONICS (2023)

Article Physics, Multidisciplinary

Catch-22s of reservoir computing

Yuanzhao Zhang et al.

Summary: Reservoir computing is a model-free framework for predicting the behavior of nonlinear dynamical systems. However, it struggles to learn the dynamics of some systems unless key information is known. Next-generation reservoir computing can accurately predict the basins of attraction of a system, but small uncertainty in nonlinearity can reduce the prediction accuracy.

PHYSICAL REVIEW RESEARCH (2023)

Article Mathematics, Applied

Optimizing memory in reservoir computers

T. L. Carroll

Summary: A reservoir computer is a computational approach that utilizes a high-dimensional dynamical system built by connecting nonlinear nodes into a network, allowing for memory and feedback. The duration of the fading memory is crucial to how effectively and efficiently the reservoir computer can solve specific problems.

Article Meteorology & Atmospheric Sciences

A Hybrid Approach to Atmospheric Modeling That Combines Machine Learning With a Physics-Based Numerical Model

Troy Arcomano et al.

Summary: This paper describes the implementation of a combined hybrid-parallel prediction approach on a low-resolution atmospheric global circulation model. The hybrid model, which combines a physics-based numerical model with a machine learning component, produces more accurate forecasts for various atmospheric variables compared to the host model. Furthermore, the hybrid model exhibits smaller systematic errors and more realistic temporal variability in simulating the climate.

JOURNAL OF ADVANCES IN MODELING EARTH SYSTEMS (2022)

Review Computer Science, Interdisciplinary Applications

Review of ML and AutoML Solutions to Forecast Time-Series Data

Ahmad Alsharef et al.

Summary: Time-series forecasting is an important discipline in data modeling that utilizes past observations to predict future values. This article reviews the methods of analyzing time series, from traditional linear modeling techniques to automated machine learning (AutoML) frameworks and deep learning models. The objective is to identify the challenges of time-series forecasting and the techniques used to address them. The article serves as a guide and reference for researchers and industries using AutoML for forecasting, while also highlighting gaps in previous works and forecasting techniques.

ARCHIVES OF COMPUTATIONAL METHODS IN ENGINEERING (2022)

Article Multidisciplinary Sciences

Kernel learning for robust dynamic mode decomposition: linear and nonlinear disambiguation optimization

Peter J. Baddoo et al.

Summary: Research in modern data-driven dynamical systems tackles the challenges of high dimensionality, unknown dynamics, and nonlinearity. This work presents a kernel method for learning interpretable data-driven models for high-dimensional, nonlinear systems. The method efficiently handles high-dimensional data and incorporates partial knowledge of system physics.

PROCEEDINGS OF THE ROYAL SOCIETY A-MATHEMATICAL PHYSICAL AND ENGINEERING SCIENCES (2022)

Article Computer Science, Artificial Intelligence

Chain-Structure Echo State Network With Stochastic Optimization: Methodology and Application

Zhou Wu et al.

Summary: In this article, a new chain-structure echo state network (CESN) is proposed for multivariate time series prediction, utilizing the philosophy of "divide and conquer" to divide input vectors into clusters. The network is trained using least-squares regression and stochastic local search (SLS) to optimize the output weights and minimize the loss function, effectively preventing overfitting. The effectiveness and robustness of CESN and SLS-CESN are verified through chaos prediction benchmarks and real applications.

IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (2022)

Article Mathematics, Applied

Time shifts to reduce the size of reservoir computers

Thomas L. Carroll et al.

Summary: A reservoir computer is a powerful computing system constructed by connecting a large number of nonlinear nodes, which can achieve accurate results and operate at high speed. However, the complexity of the construction and the large number of nonlinear nodes required make hardware implementation challenging. This study proposes a time-shifting technique that divides the reservoir computer into a small set of nonlinear nodes and a separate set of time-shifted reservoir output signals, which greatly simplifies the construction and improves the performance of the reservoir computer.
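
Read in this way, the time-shift technique amounts to augmenting a small reservoir's state record with shifted copies of its own node signals before training the linear readout. The following minimal Python illustration is a sketch under that reading; the function name, shift values, and sizes are illustrative and not taken from the paper.

```python
import numpy as np

def time_shift_augment(states, shifts):
    """Augment reservoir states (T, N) with time-shifted copies of each node signal.

    shifts : list of non-negative integer shifts, one per copy; the first
    max(shifts) rows are dropped so that all shifted copies stay aligned."""
    m = max(shifts)
    blocks = [states[m - s : states.shape[0] - s] for s in shifts]
    return np.hstack(blocks)

# Example: a 20-node reservoir with three shifted copies gives the linear
# readout a 60-dimensional feature space.
# augmented = time_shift_augment(states, shifts=[0, 3, 7])
```
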
Article Mathematics, Applied

Learning unseen coexisting attractors

Daniel J. Gauthier et al.

Summary: Reservoir computing is a machine learning approach that can generate a surrogate model of a dynamical system. The next-generation reservoir computing approach simplifies training further and exhibits higher accuracy in predicting attractor characteristics compared to traditional approaches.

Article Mathematics, Applied

Model-free prediction of multistability using echo state network

Mousumi Roy et al.

Summary: In this article, a data-driven approach using echo state network (ESN) is investigated to infer the dynamics of multistable systems. The machine is able to predict diverse dynamics for different parameter values, even at distant parameters from the training dynamics. The whole bifurcation diagram can also be accurately predicted. Additionally, the study extends to exploring the dynamics of co-existing attractors at unknown parameter values and identifying the basins for different attractors.

Article Mathematics, Applied

Learning spatiotemporal chaos using next-generation reservoir computing

Wendson A. S. Barbosa et al.

Summary: Forecasting the behavior of high-dimensional dynamical systems using machine learning requires efficient methods to learn the underlying physical model. This study demonstrates spatiotemporal chaos prediction using a machine learning architecture combined with a next-generation reservoir computer, which achieves state-of-the-art performance with significantly faster training process and smaller training data set compared to other machine learning algorithms. The computational cost and training data are further reduced by exploiting the translational symmetry of the model.

Article Optics

Reservoir computing based on an external-cavity semiconductor laser with optical feedback modulation

Kazutaka Kanno et al.

Summary: In this study, reservoir computing based on a single semiconductor laser with optical feedback modulation is investigated numerically and experimentally. The results show that the performance of the phase modulation scheme is better than that of the intensity modulation scheme, and the prediction error improves for large injection currents. The physical origin of the superior performance of the phase modulation scheme is analyzed using steady-state analysis in the phase space.

OPTICS EXPRESS (2022)

Article Physics, Multidisciplinary

Spatiotemporal Transformer Neural Network for Time-Series Forecasting

Yujie You et al.

Summary: This study proposes a novel spatiotemporal transformer neural network (STNN) for efficient prediction of high-dimensional, short-term time series, which combines a continuous attention mechanism with several other attention mechanisms to integrate the information needed for prediction. Experimental results demonstrate that STNN significantly outperforms existing methods in multi-step forecasting.

ENTROPY (2022)

Proceedings Paper Computer Science, Artificial Intelligence

Photonic reservoir computing with non-linear memory cells: Interplay between topology, delay and delayed input

Lina C. Jaurigue et al.

Summary: This article discusses the performance of photonic reservoir computing, focusing on the impact of delay lines and the interplay between coupling topology and performance for various benchmark tasks. The study shows that additional delayed input can be beneficial for reservoir computing setups, as it provides an easy tuning parameter to improve the performance on a range of tasks.

EMERGING TOPICS IN ARTIFICIAL INTELLIGENCE (ETAI) 2022 (2022)

Article Computer Science, Artificial Intelligence

Master Memory Function for Delay-Based Reservoir Computers With Single-Variable Dynamics

Felix Koester et al.

Summary: This study demonstrates that delay-based reservoir computers with single-variable dynamics can be characterized by a universal master memory function (MMF), which yields their linear memory capacity. An analytical description of the MMF is proposed for efficient computation and can be applied to a variety of reservoir scenarios.

IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (2022)

Article Physics, Multidisciplinary

Reservoir Computing with Delayed Input for Fast and Easy Optimisation

Lina Jaurigue et al.

Summary: Reservoir computing is a machine learning method that utilizes the response of a dynamical system to solve tasks; it is particularly suited to hardware implementation and effective for time-series prediction. Although the reservoir parameters still need to be optimized, including a time-delayed version of the input can improve performance significantly.

ENTROPY (2021)
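
The delayed-input idea amounts to driving the reservoir with both the current input u(t) and a time-shifted copy u(t - d), where the delay d becomes an easy-to-tune parameter. A minimal leaky-tanh echo state network sketch under that reading is given below; the node count, scalings, and leak rate are illustrative choices, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def run_esn_with_delayed_input(u, delay=5, n_nodes=200, leak=0.3,
                               spectral_radius=0.9, input_scale=0.5):
    """Drive a leaky-tanh ESN with the concatenated input [u(t), u(t-delay)]."""
    W = rng.normal(size=(n_nodes, n_nodes))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))   # rescale recurrence
    W_in = rng.uniform(-input_scale, input_scale, size=(n_nodes, 2))

    T = len(u)
    states = np.zeros((T, n_nodes))
    x = np.zeros(n_nodes)
    for t in range(T):
        u_del = u[t - delay] if t >= delay else 0.0   # time-delayed copy of the input
        drive = W_in @ np.array([u[t], u_del])
        x = (1 - leak) * x + leak * np.tanh(W @ x + drive)
        states[t] = x
    return states
```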

Article Multidisciplinary Sciences

Domain-driven models yield better predictions at lower cost than reservoir computers in Lorenz systems

Ryan Pyle et al.

Summary: Recent advances in computing algorithms and hardware have increased interest in developing high-accuracy, low-cost surrogate models for simulating physical systems. The echo state network (ESN) technique has gained popularity within the weather and climate modeling community. The study finds that state-of-the-art LSR-ESNs reduce to a polynomial regression model, termed D2R2, which yields better predictions at significantly lower computational cost than other approaches.

PHILOSOPHICAL TRANSACTIONS OF THE ROYAL SOCIETY A-MATHEMATICAL PHYSICAL AND ENGINEERING SCIENCES (2021)

Article Mathematics, Applied

Echo State Networks trained by Tikhonov least squares are L2(μ) approximators of ergodic dynamical systems

Allen G. Hart et al.

Summary: Echo State Networks (ESNs) are single-layer recurrent neural networks whose readout is trained by regularised (Tikhonov) linear least-squares regression, and they can approximate target functions effectively. Numerical experiments on the Lorenz system demonstrate the validity and feasibility of the approach.

PHYSICA D-NONLINEAR PHENOMENA (2021)
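
The training step referred to here, regularised (Tikhonov) linear least squares on the collected reservoir states, is a single closed-form computation. A minimal sketch, assuming the states have already been harvested (all names and the example usage are illustrative):

```python
import numpy as np

def train_readout(states, targets, ridge=1e-6):
    """Tikhonov-regularised least-squares readout:
    minimise ||states @ W - targets||^2 + ridge * ||W||^2."""
    X, Y = np.asarray(states), np.asarray(targets)
    return np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y)

# Example: one-step-ahead prediction of a scalar series from reservoir states,
# where states[t] is the reservoir state after seeing series[t]:
# W_out = train_readout(states[:-1], series[1:])
# next_value = states[-1] @ W_out
```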

Article Mathematics, Applied

Using data assimilation to train a hybrid forecast system that combines machine-learning and knowledge-based components

Alexander Wikner et al.

Summary: This study discusses the forecasting of chaotic dynamical systems using noisy partial measurements data, with a focus on combining machine learning with knowledge-based models to improve predictions. By assimilating synthetic data and training machine learning models with partial measurements, it shows potential to correct imperfections in knowledge-based models and improve forecasting accuracy.

Article Engineering, Electrical & Electronic

Comprehensive Performance Analysis of a VCSEL-Based Photonic Reservoir Computer

Julian Bueno et al.

Summary: Optical neural networks that combine reservoir computing with Vertical Cavity Surface Emitting Lasers (VCSELs) show high performance and strong potential for optical neural network implementations. VCSELs offer unique advantages for future photonic neural networks, such as high speed, low power consumption, reduced cost, and ease of integration.

IEEE PHOTONICS TECHNOLOGY LETTERS (2021)

Article Multidisciplinary Sciences

Next generation reservoir computing

Daniel J. Gauthier et al.

Summary: Reservoir computers are artificial neural networks that can be trained on small data sets, but they rely on large random matrices and numerous metaparameters. This work presents nonlinear vector autoregression as a superior machine learning algorithm compared to conventional reservoir computing, requiring even less training data and shorter training time.

NATURE COMMUNICATIONS (2021)
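
Next-generation reservoir computing replaces the random recurrent network with a nonlinear vector autoregression: the feature vector consists of a constant, a few time-delayed copies of the input, and low-order polynomial products of those copies, followed by the same ridge-regression readout as a conventional reservoir. A minimal sketch of the feature construction; the number of delays, their spacing, and the quadratic order are illustrative choices.

```python
import numpy as np

def nvar_features(x, k=2, s=1):
    """Build NVAR features from a multivariate series x of shape (T, d):
    a constant, k time-delayed copies spaced s steps apart, and all
    unique quadratic products of those delayed copies."""
    T, d = x.shape
    start = (k - 1) * s
    # linear part: current value plus k-1 delayed copies, aligned in time
    lin = np.hstack([x[start - i * s : T - i * s] for i in range(k)])
    # quadratic part: all unique pairwise products of the linear features
    quad = np.stack([lin[:, i] * lin[:, j]
                     for i in range(lin.shape[1])
                     for j in range(i, lin.shape[1])], axis=1)
    ones = np.ones((lin.shape[0], 1))
    return np.hstack([ones, lin, quad])

# The readout is then an ordinary ridge regression from these features to the
# next state, exactly as for a conventional reservoir.
```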

Article Computer Science, Artificial Intelligence

Teaching recurrent neural networks to infer global temporal structure from local examples

Jason Z. Kim et al.

Summary: Computational systems are designed to store and manipulate information, while neurobiological systems adapt to perform similar functions without explicit engineering. Recent research has shown that recurrent neural networks (RNNs) can learn to modify complex information representations using examples and control signals, allowing for continuous interpolation and extrapolation of these representations far beyond the training data. Furthermore, RNNs can infer bifurcation structures and chaos routes, as well as extrapolate non-dynamical trajectories.

NATURE MACHINE INTELLIGENCE (2021)

Article Optics

Insight into delay based reservoir computing via eigenvalue analysis

Felix Koester et al.

Summary: This paper provides deep insight into the computational capability of delay-based reservoir computing through an eigenvalue analysis and finds a connection between task-independent memory capacity and the eigenvalue spectrum of the dynamical system. The performance of reservoir computing can be predicted by analyzing the small-signal response of the reservoir, and the analysis can be applied to any dynamical system used as a reservoir.

JOURNAL OF PHYSICS-PHOTONICS (2021)

Article Multidisciplinary Sciences

Assessing the scales in numerical weather and climate predictions: will exascale be the rescue?

Philipp Neumann et al.

PHILOSOPHICAL TRANSACTIONS OF THE ROYAL SOCIETY A-MATHEMATICAL PHYSICAL AND ENGINEERING SCIENCES (2019)

Article Physics, Multidisciplinary

Model-Free Prediction of Large Spatiotemporally Chaotic Systems from Data: A Reservoir Computing Approach

Jaideep Pathak et al.

PHYSICAL REVIEW LETTERS (2018)

Article Multidisciplinary Sciences

Data-driven forecasting of high-dimensional chaotic systems with long short-term memory networks

Pantelis R. Vlachas et al.

PROCEEDINGS OF THE ROYAL SOCIETY A-MATHEMATICAL PHYSICAL AND ENGINEERING SCIENCES (2018)

Article Optics

Compact reservoir computing with a photonic integrated circuit

Kosuke Takano et al.

OPTICS EXPRESS (2018)

Article Computer Science, Artificial Intelligence

Genetic algorithm optimized double-reservoir echo state network for multi-regime time series prediction

Shisheng Zhong et al.

NEUROCOMPUTING (2017)

Article Computer Science, Artificial Intelligence

Reservoir Computing with an Ensemble of Time-Delay Reservoirs

Silvia Ortin et al.

COGNITIVE COMPUTATION (2017)

Article Computer Science, Artificial Intelligence

Real-time Audio Processing with a Cascade of Discrete-Time Delay Line-Based Reservoir Computers

Lars Keuninckx et al.

COGNITIVE COMPUTATION (2017)

Article Multidisciplinary Sciences

Discovering governing equations from data by sparse identification of nonlinear dynamical systems

Steven L. Brunton et al.

PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA (2016)

Article Multidisciplinary Sciences

Optimal nonlinear information processing capacity in delay-based reservoir computers

Lyudmila Grigoryeva et al.

SCIENTIFIC REPORTS (2015)

Article Multidisciplinary Sciences

Parallel photonic information processing at gigabyte per second data rates using transient states

Daniel Brunner et al.

NATURE COMMUNICATIONS (2013)

Article Multidisciplinary Sciences

Optoelectronic Reservoir Computing

Y. Paquot et al.

SCIENTIFIC REPORTS (2012)

Article Multidisciplinary Sciences

Information Processing Capacity of Dynamical Systems

Joni Dambre et al.

SCIENTIFIC REPORTS (2012)

Article Multidisciplinary Sciences

Information processing using a single dynamical node as complex system

L. Appeltant et al.

NATURE COMMUNICATIONS (2011)

Article Mathematics, Applied

Fourth-order time-stepping for stiff PDEs

A.-K. Kassam et al.

SIAM JOURNAL ON SCIENTIFIC COMPUTING (2005)

Article Computer Science, Artificial Intelligence

Real-time computing without stable states: A new framework for neural computation based on perturbations

W. Maass et al.

NEURAL COMPUTATION (2002)