Printed from https://ideas.repec.org/a/eee/appene/v404y2026ics0306261925018598.html

Data-driven a convergence-enhanced fusion energy management strategy based on teacher agent guidance for hybrid electric vehicles

Author

Listed:
  • Li, Xueliang
  • Liu, Yilong
  • Yan, Mei
  • Tian, Dayu
  • Yang, Shujun
  • Peng, Zengxiong

Abstract

Learning-based energy management strategies are considered among the most promising vehicle energy-saving technologies, but the low sampling efficiency and slow convergence of their training process seriously restrict large-scale deployment. To address this problem, this paper proposes a fusion energy management strategy that combines prior knowledge with deep learning, guided by a teacher strategy (a rule-based strategy and the equivalent consumption minimization strategy), to resolve the poor convergence quality and slow convergence rate caused by an inconsistent convergence direction of the learning agent. By incorporating the teacher agent's prior knowledge, the proposed strategy initially constrains the agent's exploration direction. As training iterates, the influence of the prior knowledge on the agent is adjusted through a dynamic weight, forming three stages: "constraint - transition - autonomy". This enables the strategy both to accelerate convergence and to achieve better fuel economy under different driving conditions and expert experiences. Simulation results show that, compared with the traditional learning method, the proposed method improves the convergence rate by 40.1 % to 42.7 % and fuel economy by 1.2 % to 1.9 %, reaching 97.9 % to 98.7 % of the dynamic programming benchmark. In addition, the proposed method can readily be extended to other types of hybrid electric vehicles.
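The three-stage "constraint - transition - autonomy" mechanism described in the abstract can be sketched as a dynamic weight that governs how often the teacher's (rule-based/ECMS) action overrides the learning agent's action. This is an illustrative reconstruction only: the stage boundaries, the linear decay, and the function names below are assumptions, not the paper's actual weighting scheme.

```python
import random

def teacher_weight(episode, constraint_end=100, transition_end=300):
    """Illustrative dynamic weight of the teacher's prior knowledge.

    Stage 1 (constraint):  weight 1.0 -- the teacher fully guides exploration.
    Stage 2 (transition):  weight decays linearly as control is handed over.
    Stage 3 (autonomy):    weight 0.0 -- the agent explores on its own.
    The stage boundaries and linear schedule are hypothetical choices.
    """
    if episode < constraint_end:
        return 1.0
    if episode < transition_end:
        return 1.0 - (episode - constraint_end) / (transition_end - constraint_end)
    return 0.0

def select_action(agent_action, teacher_action, episode):
    """With probability w, apply the teacher's (rule-based/ECMS) action;
    otherwise apply the learning agent's own action."""
    w = teacher_weight(episode)
    return teacher_action if random.random() < w else agent_action
```

Under this sketch, early episodes train almost entirely on teacher-generated transitions (constraining the convergence direction), while later episodes let the agent improve beyond the teacher's prior knowledge.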

Suggested Citation

  • Li, Xueliang & Liu, Yilong & Yan, Mei & Tian, Dayu & Yang, Shujun & Peng, Zengxiong, 2026. "Data-driven a convergence-enhanced fusion energy management strategy based on teacher agent guidance for hybrid electric vehicles," Applied Energy, Elsevier, vol. 404(C).
  • Handle: RePEc:eee:appene:v:404:y:2026:i:c:s0306261925018598
    DOI: 10.1016/j.apenergy.2025.127129

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0306261925018598
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.apenergy.2025.127129?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

As access to this document is restricted, you may want to look for a different version of it.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Huang, Ruchen & He, Hongwen & Su, Qicong, 2024. "Smart energy management for hybrid electric bus via improved soft actor-critic algorithm in a heuristic learning framework," Energy, Elsevier, vol. 309(C).
    2. Dong, Peng & Zhao, Junwei & Liu, Xuewu & Wu, Jian & Xu, Xiangyang & Liu, Yanfang & Wang, Shuhan & Guo, Wei, 2022. "Practical application of energy management strategy for hybrid electric vehicles based on intelligent and connected technologies: Development stages, challenges, and future trends," Renewable and Sustainable Energy Reviews, Elsevier, vol. 170(C).
    3. Daniel Egan & Qilun Zhu & Robert Prucka, 2023. "A Review of Reinforcement Learning-Based Powertrain Controllers: Effects of Agent Selection for Mixed-Continuity Control and Reward Formulation," Energies, MDPI, vol. 16(8), pages 1-31, April.
    4. Tang, Tianfeng & Peng, Qianlong & Shi, Qing & Peng, Qingguo & Zhao, Jin & Chen, Chaoyi & Wang, Guangwei, 2024. "Energy management of fuel cell hybrid electric bus in mountainous regions: A deep reinforcement learning approach considering terrain characteristics," Energy, Elsevier, vol. 311(C).
    5. Huang, Ruchen & He, Hongwen & Su, Qicong & Wu, Jingda, 2025. "Towards sustainable and intelligent urban transportation: A novel deep transfer reinforcement learning framework for eco-driving of fuel cell buses," Energy, Elsevier, vol. 330(C).
    6. Yang, Dongpo & Liu, Tong & Song, Dafeng & Zhang, Xuanming & Zeng, Xiaohua, 2023. "A real time multi-objective optimization Guided-MPC strategy for power-split hybrid electric bus based on velocity prediction," Energy, Elsevier, vol. 276(C).
    7. Qi, Chunyang & Song, Chuanxue & Xiao, Feng & Song, Shixin, 2022. "Generalization ability of hybrid electric vehicle energy management strategy based on reinforcement learning method," Energy, Elsevier, vol. 250(C).
    8. Liu, Teng & Tan, Wenhao & Tang, Xiaolin & Zhang, Jinwei & Xing, Yang & Cao, Dongpu, 2021. "Driving conditions-driven energy management strategies for hybrid electric vehicles: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 151(C).
    9. Li, Jianwei & Liu, Jie & Yang, Qingqing & Wang, Tianci & He, Hongwen & Wang, Hanxiao & Sun, Fengchun, 2025. "Reinforcement learning based energy management for fuel cell hybrid electric vehicles: A comprehensive review on decision process reformulation and strategy implementation," Renewable and Sustainable Energy Reviews, Elsevier, vol. 213(C).
    10. Matteo Acquarone & Claudio Maino & Daniela Misul & Ezio Spessa & Antonio Mastropietro & Luca Sorrentino & Enrico Busto, 2023. "Influence of the Reward Function on the Selection of Reinforcement Learning Agents for Hybrid Electric Vehicles Real-Time Control," Energies, MDPI, vol. 16(6), pages 1-22, March.
    11. Fang, Shuo & Hu, Shuangxi & Liu, Yuntao & Zhao, Chunhui & Wang, Ying, 2025. "Power management unit with maximum-efficiency-point-tracking to enhance the efficiency of micro DMFC stack," Energy, Elsevier, vol. 315(C).
    12. Huang, Ruchen & He, Hongwen & Su, Qicong & Härtl, Martin & Jaensch, Malte, 2025. "Type- and task-crossing energy management for fuel cell vehicles with longevity consideration: A heterogeneous deep transfer reinforcement learning framework," Applied Energy, Elsevier, vol. 377(PC).
    13. Liu, Zemin Eitan & Li, Yong & Zhou, Quan & Shuai, Bin & Hua, Min & Xu, Hongming & Xu, Lubing & Tan, Guikun & Li, Yanfei, 2025. "Real-time energy management for HEV combining naturalistic driving data and deep reinforcement learning with high generalization," Applied Energy, Elsevier, vol. 377(PA).
    14. Mahmud, Sakib & Sayed, Aya Nabil & Himeur, Yassine & Nhlabatsi, Armstrong & Bensaali, Faycal, 2026. "A comprehensive review of deep reinforcement learning applications from centralized power generation to modern energy internet frameworks," Renewable and Sustainable Energy Reviews, Elsevier, vol. 226(PE).
    15. Wang, Yue & Li, Keqiang & Zeng, Xiaohua & Gao, Bolin & Hong, Jichao, 2023. "Investigation of novel intelligent energy management strategies for connected HEB considering global planning of fixed-route information," Energy, Elsevier, vol. 263(PB).
    16. Marouane Adnane & Ahmed Khoumsi & João Pedro F. Trovão, 2023. "Efficient Management of Energy Consumption of Electric Vehicles Using Machine Learning—A Systematic and Comprehensive Survey," Energies, MDPI, vol. 16(13), pages 1-39, June.
    17. Feng, Zhiyan & Zhang, Qingang & Zhang, Yiming & Fei, Liangyu & Jiang, Fei & Zhao, Shengdun, 2024. "Practicability analysis of online deep reinforcement learning towards energy management strategy of 4WD-BEVs driven by dual-motor in-wheel motors," Energy, Elsevier, vol. 290(C).
    18. Yazar, Ozan & Coskun, Serdar & Zhang, Fengqi & Li, Lin & Huang, Cong & Mei, Peng & Karimi, Hamid Reza, 2025. "A novel energy management strategy for hybrid electric vehicles using deep reinforcement incentive learning," Energy, Elsevier, vol. 334(C).
    19. Hu, Dong & Xie, Hui & Song, Kang & Zhang, Yuanyuan & Yan, Long, 2023. "An apprenticeship-reinforcement learning scheme based on expert demonstrations for energy management strategy of hybrid electric vehicles," Applied Energy, Elsevier, vol. 342(C).
    20. Alessia Musa & Pier Giuseppe Anselma & Giovanni Belingardi & Daniela Anna Misul, 2023. "Energy Management in Hybrid Electric Vehicles: A Q-Learning Solution for Enhanced Drivability and Energy Efficiency," Energies, MDPI, vol. 17(1), pages 1-20, December.

    More about this item



    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:appene:v:404:y:2026:i:c:s0306261925018598. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/description#description .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.