Article

Transfer Deep Reinforcement Learning-Enabled Energy Management Strategy for Hybrid Tracked Vehicle

Journal

IEEE ACCESS
Volume 8, Pages 165837-165848

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/ACCESS.2020.3022944

Keywords

Energy management; Batteries; Generators; Resistance; Mechanical power transmission; Engines; Training; Deep reinforcement learning; transfer learning; hybrid tracked vehicle; energy management strategy; deep deterministic policy gradient

Funding

  1. Chongqing Science and Technology Project [cstc2019jcyj-msxmX0636, cstc2019jcyj-msxmX0481]
  2. Yangtze Normal University [2016KYQD16]
  3. Chongqing Education Commission [KJ1712297, KJQN201901321]


This paper proposes an adaptive energy management strategy for hybrid electric vehicles that combines deep reinforcement learning (DRL) and transfer learning (TL), aiming to address DRL's drawback of tedious training time. First, an optimal control model of a hybrid tracked vehicle is built, in which the powertrain components are introduced in detail. Then, a bi-level control framework is constructed to derive the energy management strategies (EMSs). The upper level applies the deep deterministic policy gradient (DDPG) algorithm to train EMSs at different speed intervals; the lower level employs TL to adapt the pre-trained neural networks to a novel driving cycle. Finally, a series of experiments demonstrates the effectiveness of the presented control framework and illustrates the optimality and adaptability of the formulated EMS. The resulting DRL- and TL-enabled control policy is capable of enhancing energy efficiency and improving system performance.
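The transfer step described in the abstract can be sketched in miniature: an actor network is pre-trained (upper level, DDPG over one speed interval) and its weights then warm-start the policy for a novel driving cycle (lower level, TL), which is fine-tuned instead of trained from scratch. The sketch below is a hypothetical illustration, not the paper's implementation: the layer sizes, state/action dimensions, and the `transfer` helper are all assumptions for demonstration.

```python
import numpy as np

class Actor:
    """Tiny deterministic policy network (vehicle state -> power-split action).

    Hypothetical stand-in for a DDPG actor; the architecture is
    illustrative and not taken from the paper.
    """
    def __init__(self, state_dim=3, hidden=16, action_dim=1, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0.0, 0.1, (state_dim, hidden))
        self.w2 = rng.normal(0.0, 0.1, (hidden, action_dim))

    def act(self, state):
        h = np.tanh(state @ self.w1)   # hidden layer
        return np.tanh(h @ self.w2)    # bounded action in [-1, 1]

def transfer(source: Actor, target: Actor) -> None:
    """Lower-level TL step: initialise the new-cycle actor from the
    weights pre-trained on a previous speed interval."""
    target.w1 = source.w1.copy()
    target.w2 = source.w2.copy()

# Upper level: assume `pretrained` was trained with DDPG on one speed interval.
pretrained = Actor(seed=1)

# Lower level: warm-start a fresh actor for a novel driving cycle,
# then fine-tune it (fine-tuning loop omitted in this sketch).
new_cycle_actor = Actor(seed=2)
transfer(pretrained, new_cycle_actor)
```

Copying (rather than sharing) the weight arrays lets the new-cycle actor be fine-tuned without disturbing the pre-trained policy, which is the usual reason warm-starting cuts training time relative to learning from random initialisation.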

