4.8 Article

Reinforcement Learning of Adaptive Energy Management With Transition Probability for a Hybrid Electric Tracked Vehicle

Journal

IEEE Transactions on Industrial Electronics
Volume 62, Issue 12, Pages 7837-7846

Publisher

IEEE (Institute of Electrical and Electronics Engineers Inc.)
DOI: 10.1109/TIE.2015.2475419

Keywords

Adaptability; energy management; hybrid electric tracked vehicle (HETV); Q-learning algorithm; state of charge (SOC); stochastic dynamic programming (SDP)

Funding

  1. National Natural Science Foundation of China [51375044]
  2. 111 Project [B12022]

Abstract

A reinforcement learning-based adaptive energy management (RLAEM) strategy is proposed for a hybrid electric tracked vehicle (HETV) in this paper. A control-oriented model of the HETV is first established, in which the state of charge (SOC) of the battery and the speed of the generator are the state variables, and the engine torque is the control variable. Subsequently, a transition probability matrix is learned from a specific driving schedule of the HETV. The proposed RLAEM determines the appropriate power split between the battery and the engine-generator set (EGS) to minimize fuel consumption over different driving schedules. With the RLAEM, not only is the driver's power requirement satisfied, but fuel economy is also improved. Finally, the RLAEM is compared with stochastic dynamic programming (SDP)-based energy management for different driving schedules. The simulation results demonstrate the adaptability, optimality, and learning ability of the RLAEM, as well as its capability to reduce computation time.
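The abstract describes a Q-learning energy management scheme that learns a transition probability matrix of the power demand from a driving schedule and then chooses the engine operating point to split power between the battery and the EGS. The sketch below is a minimal, self-contained illustration of that idea, not the authors' implementation: the state grids, battery pack size, cost weights, and the synthetic demand trace are all hypothetical assumptions used only for demonstration.

```python
# Minimal Q-learning sketch for a power-split decision (illustrative only).
# Assumptions (not from the paper): discretized SOC and power-demand states,
# a discrete engine-power action set, a synthetic driving-demand trace, and a
# simple fuel-proxy-plus-SOC-deviation cost.
import numpy as np

rng = np.random.default_rng(0)

soc_states = np.linspace(0.3, 0.8, 11)      # battery SOC grid (assumed bounds)
demand_states = np.linspace(0.0, 40.0, 9)   # power demand grid [kW] (assumed)
actions = np.linspace(0.0, 30.0, 7)         # engine-generator power [kW] (assumed)

# 1) Estimate the power-demand transition probability matrix from a sample
#    driving schedule (here a synthetic random-walk demand trace).
demand_trace = np.cumsum(rng.normal(0, 3, 5000)) % 40.0
idx = np.digitize(demand_trace, demand_states) - 1
T = np.full((len(demand_states), len(demand_states)), 1e-6)
for a, b in zip(idx[:-1], idx[1:]):
    T[a, b] += 1.0
T /= T.sum(axis=1, keepdims=True)

# 2) Q-learning over (SOC, demand) states and engine-power actions.
Q = np.zeros((len(soc_states), len(demand_states), len(actions)))
alpha, gamma, eps = 0.1, 0.95, 0.1
battery_kwh, dt = 10.0, 1.0 / 3600.0        # assumed pack size, 1-s step [h]

def step(s, d, a):
    """One hypothetical transition: battery covers the residual demand."""
    p_eng = actions[a]
    p_batt = demand_states[d] - p_eng
    soc = np.clip(soc_states[s] - p_batt * dt / battery_kwh, 0.3, 0.8)
    s_next = int(np.argmin(np.abs(soc_states - soc)))
    d_next = rng.choice(len(demand_states), p=T[d])   # sampled from learned T
    cost = 0.3 * p_eng + 50.0 * (0.6 - soc) ** 2      # fuel proxy + SOC penalty
    return s_next, d_next, -cost

s, d = len(soc_states) // 2, 0
for _ in range(50_000):
    a = rng.integers(len(actions)) if rng.random() < eps else int(Q[s, d].argmax())
    s2, d2, r = step(s, d, a)
    Q[s, d, a] += alpha * (r + gamma * Q[s2, d2].max() - Q[s, d, a])
    s, d = s2, d2

print("Greedy engine power at mid SOC:", actions[Q[len(soc_states) // 2].argmax(axis=1)])
```

In the actual paper the cost would come from the engine fuel-rate map and the generator dynamics rather than the linear fuel proxy and quadratic SOC penalty used here; those terms are placeholders chosen only to make the sketch runnable.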
