Printed from https://ideas.repec.org/a/eee/appene/v402y2025ipas0306261925016320.html

UpdatingEMS: An online updating framework for deep reinforcement learning-based energy management of fuel cell hybrid electric bus with integrated transfer learning

Author

Listed:
  • Huang, Ruchen
  • He, Hongwen

Abstract

Deep reinforcement learning (DRL) holds great promise for enhancing the effectiveness of energy management strategies (EMSs) for hybrid electric vehicles (HEVs). However, online updating of DRL-based EMSs remains a challenge, making it difficult to ensure their long-term optimization performance. Given this, this study proposes an online updating EMS that improves the long-term energy efficiency of a DRL-based EMS for a fuel cell hybrid electric bus by exploiting the correlation between real-time traffic information and efficient hydrogen utilization. Specifically, the future optimal safe speed is planned by dynamic programming, which addresses the coupled spatiotemporal constraints in traffic information. Furthermore, a knowledge-sharing mechanism based on transfer learning (TL) reuses the historical EMS for the planned future speed, enabling continuous updating of the soft actor-critic-based EMS. Finally, the updated EMS is deployed on the onboard controller, and its real-time control effect is verified via a processor-in-the-loop experiment. Results demonstrate that the proposed EMS improves updating efficiency by 30.08 % compared with the non-TL-integrated EMS and reduces hydrogen consumption by 6.11 % compared with the static EMS. Moreover, the updated EMS can run in real time on the onboard controller. This study is a proof of concept demonstrating EMS updating under idealized assumptions, contributing to energy-conserving intelligent transportation technologies from the perspective of energy management for HEVs.
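The abstract's first step, planning a future speed profile by dynamic programming under spatiotemporal traffic constraints, can be illustrated with a minimal sketch. This is not the authors' code: the speed grid, the per-step speed limit standing in for the coupled spatiotemporal constraints, and the stage cost (an acceleration penalty plus a quadratic drag proxy for hydrogen use) are all simplifying assumptions made here for illustration.

```python
def plan_speed_dp(horizon, speeds, limit, cost):
    """Backward dynamic programming over a discrete speed grid.

    horizon : number of time steps in the planning window
    speeds  : candidate speeds (m/s); speeds[0] is the initial speed
    limit   : per-step upper speed bound (stand-in spatiotemporal constraint)
    cost    : cost(v_prev, v) -> stage cost (hypothetical hydrogen proxy)
    Returns the minimum-cost speed sequence starting from speeds[0].
    """
    feasible = [v for v in speeds if v <= limit]
    # value[v] = best cost-to-go from speed v at the current stage
    value = {v: 0.0 for v in feasible}
    policy = []
    for _ in range(horizon - 1):
        new_value, step_policy = {}, {}
        for v_prev in feasible:
            # Pick the next speed minimizing stage cost plus cost-to-go
            best = min(feasible, key=lambda v: cost(v_prev, v) + value[v])
            new_value[v_prev] = cost(v_prev, best) + value[best]
            step_policy[v_prev] = best
        value = new_value
        policy.append(step_policy)
    policy.reverse()
    # Roll the optimal policy forward from the initial speed
    profile, v = [speeds[0]], speeds[0]
    for step_policy in policy:
        v = step_policy[v]
        profile.append(v)
    return profile
```

A toy call such as `plan_speed_dp(5, [0, 5, 10, 15], 10, lambda a, b: abs(b - a) + 0.1 * b * b)` returns a five-step profile that never exceeds the 10 m/s bound; the paper's second step would then warm-start the soft actor-critic agent from the historical EMS's weights and continue training against such a planned profile.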

Suggested Citation

  • Huang, Ruchen & He, Hongwen, 2025. "UpdatingEMS: An online updating framework for deep reinforcement learning-based energy management of fuel cell hybrid electric bus with integrated transfer learning," Applied Energy, Elsevier, vol. 402(PA).
  • Handle: RePEc:eee:appene:v:402:y:2025:i:pa:s0306261925016320
    DOI: 10.1016/j.apenergy.2025.126902

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0306261925016320
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.apenergy.2025.126902?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Li, Jianwei & Liu, Jie & Yang, Qingqing & Wang, Tianci & He, Hongwen & Wang, Hanxiao & Sun, Fengchun, 2025. "Reinforcement learning based energy management for fuel cell hybrid electric vehicles: A comprehensive review on decision process reformulation and strategy implementation," Renewable and Sustainable Energy Reviews, Elsevier, vol. 213(C).
    2. Huang, Ruchen & He, Hongwen & Su, Qicong, 2024. "Towards a fossil-free urban transport system: An intelligent cross-type transferable energy management framework based on deep transfer reinforcement learning," Applied Energy, Elsevier, vol. 363(C).
    3. Xu, Hongyang & He, Hongwen & Yan, Mei & Wu, Jingda & Li, Menglin, 2025. "Hierarchical energy management for fuel cell buses: A graph-agent DRL framework bridging macroscopic traffic flow and microscopic powertrain dynamics," Energy, Elsevier, vol. 332(C).
    4. Ye, Tong & Huang, Yuping & Yang, Weijia & Cai, Guotian & Yang, Yuyao & Pan, Feng, 2025. "Safe multi-agent deep reinforcement learning for decentralized low-carbon operation in active distribution networks and multi-microgrids," Applied Energy, Elsevier, vol. 387(C).
    5. Li, Jie & Fotouhi, Abbas & Liu, Yonggang & Zhang, Yuanjian & Chen, Zheng, 2024. "Review on eco-driving control for connected and automated vehicles," Renewable and Sustainable Energy Reviews, Elsevier, vol. 189(PB).
    6. Liu, Bo & Sun, Chao & Wang, Bo & Liang, Weiqiang & Ren, Qiang & Li, Junqiu & Sun, Fengchun, 2022. "Bi-level convex optimization of eco-driving for connected Fuel Cell Hybrid Electric Vehicles through signalized intersections," Energy, Elsevier, vol. 252(C).
    7. Lei, Nuo & Zhang, Hao & Hu, Jingjing & Hu, Zunyan & Wang, Zhi, 2025. "Sim-to-real design and development of reinforcement learning-based energy management strategies for fuel cell electric vehicles," Applied Energy, Elsevier, vol. 393(C).
    8. Zou, Yuan & Liu, Teng & Liu, Dexing & Sun, Fengchun, 2016. "Reinforcement learning-based real-time energy management for a hybrid tracked vehicle," Applied Energy, Elsevier, vol. 171(C), pages 372-382.
    9. Wu, Peng & Partridge, Julius & Bucknall, Richard, 2020. "Cost-effective reinforcement learning energy management for plug-in hybrid fuel cell and battery ships," Applied Energy, Elsevier, vol. 275(C).
    10. Wang, Hanchen & Ye, Yiming & Zhang, Jiangfeng & Xu, Bin, 2023. "A comparative study of 13 deep reinforcement learning based energy management methods for a hybrid electric vehicle," Energy, Elsevier, vol. 266(C).
    11. Badji, Abderrezak & Obeid, Hussein & Hilairet, Mickael & Laghrouche, Salah & Abdeslam, Djaffar Ould & Djerdir, Abdesslem, 2025. "Enhanced energy management of fuel cell electric vehicles using integral sliding mode control and passivity-based control," Applied Energy, Elsevier, vol. 377(PD).
    12. Du, Guodong & Zou, Yuan & Zhang, Xudong & Guo, Lingxiong & Guo, Ningyuan, 2022. "Energy management for a hybrid electric vehicle based on prioritized deep reinforcement learning framework," Energy, Elsevier, vol. 241(C).
    13. Dong, Haoxuan & Shi, Junzhe & Zhuang, Weichao & Li, Zhaojian & Song, Ziyou, 2025. "Analyzing the impact of mixed vehicle platoon formations on vehicle energy and traffic efficiencies," Applied Energy, Elsevier, vol. 377(PA).
    14. Elia Kaufmann & Leonard Bauersfeld & Antonio Loquercio & Matthias Müller & Vladlen Koltun & Davide Scaramuzza, 2023. "Champion-level drone racing using deep reinforcement learning," Nature, Nature, vol. 620(7976), pages 982-987, August.
    15. Du, Guodong & Zou, Yuan & Zhang, Xudong & Kong, Zehui & Wu, Jinlong & He, Dingbo, 2019. "Intelligent energy management for hybrid electric tracked vehicles using online reinforcement learning," Applied Energy, Elsevier, vol. 251(C), pages 1-1.
    16. Togun, Hussein & Basem, Ali & Abdulrazzaq, Tuqa & Biswas, Nirmalendu & Abed, Azher M. & dhabab, Jameel M. & Chattopadhyay, Anirban & Slimi, Khalifa & Paul, Dipankar & Barmavatu, Praveen & Chrouda, Ama, 2025. "Development and comparative analysis between battery electric vehicles (BEV) and fuel cell electric vehicles (FCEV)," Applied Energy, Elsevier, vol. 388(C).
    17. Zhang, Chuntao & Huang, Wenhui & Zhou, Xingyu & Lv, Chen & Sun, Chao, 2024. "Expert-demonstration-augmented reinforcement learning for lane-change-aware eco-driving traversing consecutive traffic lights," Energy, Elsevier, vol. 286(C).
    18. Huang, Ruchen & He, Hongwen & Zhao, Xuyang & Wang, Yunlong & Li, Menglin, 2022. "Battery health-aware and naturalistic data-driven energy management for hybrid electric bus based on TD3 deep reinforcement learning algorithm," Applied Energy, Elsevier, vol. 321(C).
    19. Sun, Ziyi & Guo, Rong & Luo, Maohui, 2025. "Integrated energy-thermal management strategy for range extended electric vehicles based on soft actor-critic under low environment temperature," Energy, Elsevier, vol. 330(C).
    20. Volodymyr Mnih & Koray Kavukcuoglu & David Silver & Andrei A. Rusu & Joel Veness & Marc G. Bellemare & Alex Graves & Martin Riedmiller & Andreas K. Fidjeland & Georg Ostrovski & Stig Petersen & Charle, 2015. "Human-level control through deep reinforcement learning," Nature, Nature, vol. 518(7540), pages 529-533, February.
    21. Zhao, Yinghua & Huang, Siqi & Wang, Xiaoyu & Shi, Jingwu & Yao, Shouwen, 2024. "Energy management with adaptive moving average filter and deep deterministic policy gradient reinforcement learning for fuel cell hybrid electric vehicles," Energy, Elsevier, vol. 312(C).
    22. Jinquan, Guo & Hongwen, He & Jianwei, Li & Qingwu, Liu, 2021. "Real-time energy management of fuel cell hybrid electric buses: Fuel cell engines friendly intersection speed planning," Energy, Elsevier, vol. 226(C).
    23. Chang, Chengcheng & Zhao, Wanzhong & Wang, Chunyan & Luan, Zhongkai, 2023. "An energy management strategy of deep reinforcement learning based on multi-agent architecture under self-generating conditions," Energy, Elsevier, vol. 283(C).
    24. Liu, Zemin Eitan & Li, Yong & Zhou, Quan & Shuai, Bin & Hua, Min & Xu, Hongming & Xu, Lubing & Tan, Guikun & Li, Yanfei, 2025. "Real-time energy management for HEV combining naturalistic driving data and deep reinforcement learning with high generalization," Applied Energy, Elsevier, vol. 377(PA).
    25. Zhou, Jianhao & Xue, Siwu & Xue, Yuan & Liao, Yuhui & Liu, Jun & Zhao, Wanzhong, 2021. "A novel energy management strategy of hybrid electric vehicle via an improved TD3 deep reinforcement learning," Energy, Elsevier, vol. 224(C).
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Daniel Egan & Qilun Zhu & Robert Prucka, 2023. "A Review of Reinforcement Learning-Based Powertrain Controllers: Effects of Agent Selection for Mixed-Continuity Control and Reward Formulation," Energies, MDPI, vol. 16(8), pages 1-31, April.
    2. Huang, Ruchen & He, Hongwen & Su, Qicong & Wu, Jingda, 2025. "Towards sustainable and intelligent urban transportation: A novel deep transfer reinforcement learning framework for eco-driving of fuel cell buses," Energy, Elsevier, vol. 330(C).
    3. Huang, Ruchen & He, Hongwen & Su, Qicong & Härtl, Martin & Jaensch, Malte, 2024. "Enabling cross-type full-knowledge transferable energy management for hybrid electric vehicles via deep transfer reinforcement learning," Energy, Elsevier, vol. 305(C).
    4. Li, Jianwei & Liu, Jie & Yang, Qingqing & Wang, Tianci & He, Hongwen & Wang, Hanxiao & Sun, Fengchun, 2025. "Reinforcement learning based energy management for fuel cell hybrid electric vehicles: A comprehensive review on decision process reformulation and strategy implementation," Renewable and Sustainable Energy Reviews, Elsevier, vol. 213(C).
    5. Wang, Hanchen & Ye, Yiming & Zhang, Jiangfeng & Xu, Bin, 2023. "A comparative study of 13 deep reinforcement learning based energy management methods for a hybrid electric vehicle," Energy, Elsevier, vol. 266(C).
    6. Liu, Yonggang & Wu, Yitao & Wang, Xiangyu & Li, Liang & Zhang, Yuanjian & Chen, Zheng, 2023. "Energy management for hybrid electric vehicles based on imitation reinforcement learning," Energy, Elsevier, vol. 263(PC).
    7. Huang, Ruchen & He, Hongwen & Su, Qicong & Härtl, Martin & Jaensch, Malte, 2025. "Type- and task-crossing energy management for fuel cell vehicles with longevity consideration: A heterogeneous deep transfer reinforcement learning framework," Applied Energy, Elsevier, vol. 377(PC).
    8. Nie, Zhigen & Feng, Yaxing & Lian, Yufeng, 2025. "Deep reinforcement learning-based hierarchical control strategy for energy management of intelligent fuel cell hybrid electric vehicles," Energy, Elsevier, vol. 326(C).
    9. Liu, Zemin Eitan & Li, Yong & Zhou, Quan & Shuai, Bin & Hua, Min & Xu, Hongming & Xu, Lubing & Tan, Guikun & Li, Yanfei, 2025. "Real-time energy management for HEV combining naturalistic driving data and deep reinforcement learning with high generalization," Applied Energy, Elsevier, vol. 377(PA).
    10. Zhang, Hao & Yang, Guixiang & Lei, Nuo & Chen, Chaoyi & Chen, Boli & Qiu, Lin, 2025. "Scenario-aware electric vehicle energy control with enhanced vehicle-to-grid capability: A multi-task reinforcement learning approach," Energy, Elsevier, vol. 335(C).
    11. Huang, Ruchen & He, Hongwen & Su, Qicong, 2024. "Smart energy management for hybrid electric bus via improved soft actor-critic algorithm in a heuristic learning framework," Energy, Elsevier, vol. 309(C).
    12. Mahmud, Sakib & Sayed, Aya Nabil & Himeur, Yassine & Nhlabatsi, Armstrong & Bensaali, Faycal, 2026. "A comprehensive review of deep reinforcement learning applications from centralized power generation to modern energy internet frameworks," Renewable and Sustainable Energy Reviews, Elsevier, vol. 226(PE).
    13. Fan Wang & Yina Hong & Xiaohuan Zhao, 2025. "Research and Comparative Analysis of Energy Management Strategies for Hybrid Electric Vehicles: A Review," Energies, MDPI, vol. 18(11), pages 1-28, May.
    14. Yazar, Ozan & Coskun, Serdar & Zhang, Fengqi & Li, Lin & Huang, Cong & Mei, Peng & Karimi, Hamid Reza, 2025. "A novel energy management strategy for hybrid electric vehicles using deep reinforcement incentive learning," Energy, Elsevier, vol. 334(C).
    15. Niu, Zegong & He, Hongwen, 2024. "A data-driven solution for intelligent power allocation of connected hybrid electric vehicles inspired by offline deep reinforcement learning in V2X scenario," Applied Energy, Elsevier, vol. 372(C).
    16. Chen, Sihan & Huang, Yin & Zhang, Jie & Yu, Xinshu & Lu, Yifan & Xuan, Dongji, 2025. "Research on a novel multi-agent deep reinforcement learning eco-driving framework," Energy, Elsevier, vol. 326(C).
    17. Tang, Xiaolin & Zhou, Haitao & Wang, Feng & Wang, Weida & Lin, Xianke, 2022. "Longevity-conscious energy management strategy of fuel cell hybrid electric Vehicle Based on deep reinforcement learning," Energy, Elsevier, vol. 238(PA).
    18. Li, Xueliang & Liu, Yilong & Yan, Mei & Tian, Dayu & Yang, Shujun & Peng, Zengxiong, 2026. "Data-driven a convergence-enhanced fusion energy management strategy based on teacher agent guidance for hybrid electric vehicles," Applied Energy, Elsevier, vol. 404(C).
    19. Liu, Hui & You, Congwen & Han, Lijin & Yang, Ningkang & Liu, Baoshuai, 2025. "Off-road hybrid electric vehicle energy management strategy using multi-agent soft actor-critic with collaborative-independent algorithm," Energy, Elsevier, vol. 328(C).
    20. Li, Menglin & Yin, Long & Yan, Mei & Wu, Jingda & He, Hongwen & Jia, Chunchun, 2024. "Hierarchical intelligent energy-saving control strategy for fuel cell hybrid electric buses based on traffic flow predictions," Energy, Elsevier, vol. 304(C).

    More about this item



    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:appene:v:402:y:2025:i:pa:s0306261925016320. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/description#description .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.