Article

Optimizing long short-term memory recurrent neural networks using ant colony optimization to predict turbine engine vibration

Journal

APPLIED SOFT COMPUTING
Volume 73, Pages 969-991

Publisher

ELSEVIER
DOI: 10.1016/j.asoc.2018.09.013

Keywords

Ant colony optimization (ACO); Long short term memory recurrent neural network (LSTM); Recurrent neural network (RNN); Time series prediction; Aviation; Aerospace engineering; Turbomachinery; Turbine engine vibration; Flight parameters prediction


This article expands on research that has been done to develop a recurrent neural network (RNN) capable of predicting aircraft engine vibrations using long short-term memory (LSTM) neurons. LSTM RNNs can provide a more generalizable and robust method for prediction than analytical calculations of engine vibration, as analytical calculations must be solved iteratively based on specific empirical engine parameters, making that approach ungeneralizable across multiple engines. In initial work, multiple LSTM RNN architectures were proposed, evaluated, and compared. This research improves the performance of the most effective LSTM network design proposed in the previous work by using a promising neuroevolution method based on ant colony optimization (ACO) to develop and enhance the LSTM cell structure of the network. A parallelized version of the ACO neuroevolution algorithm was developed, and the evolved LSTM RNNs were compared to the previously used fixed topology. The evolved networks were trained on a large database of flight data records obtained from an airline, containing flights that suffered from excessive vibration. Results were obtained using MPI (Message Passing Interface) on a high performance computing (HPC) cluster, evolving 1000 different LSTM cell structures using 208 cores over 21 days. The newly evolved LSTM cells showed an improvement of 1.34 percentage points, reducing the mean prediction error from 5.61% to 4.27% when predicting excessive engine vibrations 10 s in the future, while at the same time dramatically reducing the number of weights from 21,170 to 13,150. The optimized LSTM also performed significantly better than traditional Nonlinear Output Error (NOE), Nonlinear AutoRegression with eXogenous inputs (NARX), and Nonlinear Box-Jenkins (NBJ) models, which only reached error rates of 15.73%, 12.06%, and 15.05%, respectively. Furthermore, LSTM regularization was used to validate the ACO results: the ACO-evolved LSTM outperformed the regularized LSTM by 3.35%. The NOE, NARX, and NBJ models were also regularized for cross validation, with mean prediction errors of 8.70%, 9.40%, and 9.43% respectively, which further supports the ant colony optimized LSTM RNN. (C) 2018 Elsevier B.V. All rights reserved.
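The ACO neuroevolution described in the abstract can be illustrated with a minimal sketch: ants probabilistically select candidate connections for a cell structure, the best structure found reinforces the pheromone trails, and trails evaporate over time. This is not the paper's implementation; `evaluate` is a hypothetical stand-in for training a candidate LSTM cell structure on flight data and returning its mean prediction error.

```python
import random

def evaluate(structure):
    # Hypothetical fitness: a stand-in for training an LSTM RNN with this
    # cell structure and measuring its mean prediction error (lower is
    # better). Here it simply rewards structures close to a fixed target.
    target = {0, 2}
    return len(target.symmetric_difference(structure))

def aco_search(n_connections=5, n_ants=10, n_iters=30,
               evaporation=0.1, seed=42):
    rng = random.Random(seed)
    pheromone = [1.0] * n_connections  # one trail per candidate connection
    best, best_err = None, float("inf")
    for _ in range(n_iters):
        for _ in range(n_ants):
            # Each ant selects a subset of connections, biased by pheromone.
            total = sum(pheromone)
            structure = {i for i, p in enumerate(pheromone)
                         if rng.random() < p / total * 2}
            err = evaluate(structure)
            if err < best_err:
                best, best_err = structure, err
        # Evaporate all trails, then reinforce the best structure so far.
        pheromone = [(1 - evaporation) * p for p in pheromone]
        for i in best:
            pheromone[i] += 1.0
    return best, best_err
```

In the paper, each candidate evaluation is a full LSTM training run, which is why the search was parallelized with MPI across 208 cores; the sketch above keeps the same select-evaluate-reinforce loop in serial form.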

