Journal
PROCESSES
Volume 10, Issue 6
Publisher
MDPI
DOI: 10.3390/pr10061080
Keywords
plug-in hybrid electric bus; energy management; Q-learning; dynamic SOC design zone; hardware-in-the-loop simulation
This paper proposes a reinforcement learning-based energy management method for plug-in hybrid electric buses that combines tabular Q-learning with Pontryagin's Minimum Principle and introduces a dynamic SOC design zone planning method. Experimental results demonstrate that the method achieves good real-time energy management performance and reduces fuel consumption.
The main problem in current energy management is practical applicability. To address this problem, this paper proposes a reinforcement learning (RL)-based energy management strategy for a plug-in hybrid electric bus (PHEB) that combines tabular Q-learning and Pontryagin's Minimum Principle (PMP). The main innovation distinguishing it from existing energy management strategies is a proposed dynamic SOC design zone planning method, characterized by two aspects: (1) a series of fixed locations is defined along the city bus route, and a linear SOC reference trajectory is re-planned at each fixed location; (2) a triangular zone is re-planned around the linear SOC reference trajectory. Additionally, a one-dimensional state space is designed to ensure real-time control. Off-line training demonstrates that the agent of the RL-based energy management can be well trained and generalizes well. Hardware-in-the-loop (HIL) simulation results demonstrate that the trained energy management strategy has good real-time performance and decreases fuel consumption by 12.92% compared to a rule-based control strategy.
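The abstract's core idea can be sketched in code: a tabular Q-learning agent whose one-dimensional state is the discretized deviation of the battery SOC from a linear reference trajectory along the route. The bin counts, action set, learning parameters, and the deviation band are illustrative assumptions, not values from the paper; this is a minimal sketch of the general technique, not the authors' implementation.

```python
import random

# Illustrative sketch (assumed parameters, not from the paper):
N_STATES = 11          # discretized SOC-deviation bins
N_ACTIONS = 5          # candidate power-split levels
ALPHA, GAMMA, EPS = 0.1, 0.95, 0.1

# Q-table over the one-dimensional state space.
Q = [[0.0] * N_ACTIONS for _ in range(N_STATES)]

def soc_reference(position, soc_start, soc_end, route_length):
    """Linear SOC reference between the route start and terminal."""
    frac = position / route_length
    return soc_start + (soc_end - soc_start) * frac

def state_index(soc, soc_ref, band=0.10):
    """Map the SOC deviation from the reference into one of N_STATES bins,
    clamping deviations outside +/- band (band value is an assumption)."""
    dev = max(-band, min(band, soc - soc_ref))
    return int(round((dev + band) / (2 * band) * (N_STATES - 1)))

def choose_action(s):
    """Epsilon-greedy action selection over the Q-table."""
    if random.random() < EPS:
        return random.randrange(N_ACTIONS)
    row = Q[s]
    return row.index(max(row))

def update(s, a, reward, s_next):
    """Standard tabular Q-learning update rule."""
    Q[s][a] += ALPHA * (reward + GAMMA * max(Q[s_next]) - Q[s][a])
```

In this sketch, re-planning the reference at a fixed location would simply mean calling `soc_reference` with the current position and SOC as the new starting point; the paper's triangular design zone around that trajectory is not modeled here.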