UpdatingEMS: An online updating framework for deep reinforcement learning-based energy management of fuel cell hybrid electric bus with integrated transfer learning
Suggested Citation
DOI: 10.1016/j.apenergy.2025.126902
References listed on IDEAS
- Li, Jianwei & Liu, Jie & Yang, Qingqing & Wang, Tianci & He, Hongwen & Wang, Hanxiao & Sun, Fengchun, 2025. "Reinforcement learning based energy management for fuel cell hybrid electric vehicles: A comprehensive review on decision process reformulation and strategy implementation," Renewable and Sustainable Energy Reviews, Elsevier, vol. 213(C).
- Huang, Ruchen & He, Hongwen & Su, Qicong, 2024. "Towards a fossil-free urban transport system: An intelligent cross-type transferable energy management framework based on deep transfer reinforcement learning," Applied Energy, Elsevier, vol. 363(C).
- Xu, Hongyang & He, Hongwen & Yan, Mei & Wu, Jingda & Li, Menglin, 2025. "Hierarchical energy management for fuel cell buses: A graph-agent DRL framework bridging macroscopic traffic flow and microscopic powertrain dynamics," Energy, Elsevier, vol. 332(C).
- Ye, Tong & Huang, Yuping & Yang, Weijia & Cai, Guotian & Yang, Yuyao & Pan, Feng, 2025. "Safe multi-agent deep reinforcement learning for decentralized low-carbon operation in active distribution networks and multi-microgrids," Applied Energy, Elsevier, vol. 387(C).
- Li, Jie & Fotouhi, Abbas & Liu, Yonggang & Zhang, Yuanjian & Chen, Zheng, 2024. "Review on eco-driving control for connected and automated vehicles," Renewable and Sustainable Energy Reviews, Elsevier, vol. 189(PB).
- Liu, Bo & Sun, Chao & Wang, Bo & Liang, Weiqiang & Ren, Qiang & Li, Junqiu & Sun, Fengchun, 2022. "Bi-level convex optimization of eco-driving for connected Fuel Cell Hybrid Electric Vehicles through signalized intersections," Energy, Elsevier, vol. 252(C).
- Lei, Nuo & Zhang, Hao & Hu, Jingjing & Hu, Zunyan & Wang, Zhi, 2025. "Sim-to-real design and development of reinforcement learning-based energy management strategies for fuel cell electric vehicles," Applied Energy, Elsevier, vol. 393(C).
- Zou, Yuan & Liu, Teng & Liu, Dexing & Sun, Fengchun, 2016. "Reinforcement learning-based real-time energy management for a hybrid tracked vehicle," Applied Energy, Elsevier, vol. 171(C), pages 372-382.
- Wu, Peng & Partridge, Julius & Bucknall, Richard, 2020. "Cost-effective reinforcement learning energy management for plug-in hybrid fuel cell and battery ships," Applied Energy, Elsevier, vol. 275(C).
- Wang, Hanchen & Ye, Yiming & Zhang, Jiangfeng & Xu, Bin, 2023. "A comparative study of 13 deep reinforcement learning based energy management methods for a hybrid electric vehicle," Energy, Elsevier, vol. 266(C).
- Badji, Abderrezak & Obeid, Hussein & Hilairet, Mickael & Laghrouche, Salah & Abdeslam, Djaffar Ould & Djerdir, Abdesslem, 2025. "Enhanced energy management of fuel cell electric vehicles using integral sliding mode control and passivity-based control," Applied Energy, Elsevier, vol. 377(PD).
- Du, Guodong & Zou, Yuan & Zhang, Xudong & Guo, Lingxiong & Guo, Ningyuan, 2022. "Energy management for a hybrid electric vehicle based on prioritized deep reinforcement learning framework," Energy, Elsevier, vol. 241(C).
- Dong, Haoxuan & Shi, Junzhe & Zhuang, Weichao & Li, Zhaojian & Song, Ziyou, 2025. "Analyzing the impact of mixed vehicle platoon formations on vehicle energy and traffic efficiencies," Applied Energy, Elsevier, vol. 377(PA).
- Elia Kaufmann & Leonard Bauersfeld & Antonio Loquercio & Matthias Müller & Vladlen Koltun & Davide Scaramuzza, 2023. "Champion-level drone racing using deep reinforcement learning," Nature, Nature, vol. 620(7976), pages 982-987, August.
- Du, Guodong & Zou, Yuan & Zhang, Xudong & Kong, Zehui & Wu, Jinlong & He, Dingbo, 2019. "Intelligent energy management for hybrid electric tracked vehicles using online reinforcement learning," Applied Energy, Elsevier, vol. 251(C), pages 1-1.
- Togun, Hussein & Basem, Ali & Abdulrazzaq, Tuqa & Biswas, Nirmalendu & Abed, Azher M. & dhabab, Jameel M. & Chattopadhyay, Anirban & Slimi, Khalifa & Paul, Dipankar & Barmavatu, Praveen & Chrouda, Ama, 2025. "Development and comparative analysis between battery electric vehicles (BEV) and fuel cell electric vehicles (FCEV)," Applied Energy, Elsevier, vol. 388(C).
- Zhang, Chuntao & Huang, Wenhui & Zhou, Xingyu & Lv, Chen & Sun, Chao, 2024. "Expert-demonstration-augmented reinforcement learning for lane-change-aware eco-driving traversing consecutive traffic lights," Energy, Elsevier, vol. 286(C).
- Huang, Ruchen & He, Hongwen & Zhao, Xuyang & Wang, Yunlong & Li, Menglin, 2022. "Battery health-aware and naturalistic data-driven energy management for hybrid electric bus based on TD3 deep reinforcement learning algorithm," Applied Energy, Elsevier, vol. 321(C).
- Sun, Ziyi & Guo, Rong & Luo, Maohui, 2025. "Integrated energy-thermal management strategy for range extended electric vehicles based on soft actor-critic under low environment temperature," Energy, Elsevier, vol. 330(C).
- Volodymyr Mnih & Koray Kavukcuoglu & David Silver & Andrei A. Rusu & Joel Veness & Marc G. Bellemare & Alex Graves & Martin Riedmiller & Andreas K. Fidjeland & Georg Ostrovski & Stig Petersen & Charle, 2015. "Human-level control through deep reinforcement learning," Nature, Nature, vol. 518(7540), pages 529-533, February.
- Zhao, Yinghua & Huang, Siqi & Wang, Xiaoyu & Shi, Jingwu & Yao, Shouwen, 2024. "Energy management with adaptive moving average filter and deep deterministic policy gradient reinforcement learning for fuel cell hybrid electric vehicles," Energy, Elsevier, vol. 312(C).
- Jinquan, Guo & Hongwen, He & Jianwei, Li & Qingwu, Liu, 2021. "Real-time energy management of fuel cell hybrid electric buses: Fuel cell engines friendly intersection speed planning," Energy, Elsevier, vol. 226(C).
- Chang, Chengcheng & Zhao, Wanzhong & Wang, Chunyan & Luan, Zhongkai, 2023. "An energy management strategy of deep reinforcement learning based on multi-agent architecture under self-generating conditions," Energy, Elsevier, vol. 283(C).
- Liu, Zemin Eitan & Li, Yong & Zhou, Quan & Shuai, Bin & Hua, Min & Xu, Hongming & Xu, Lubing & Tan, Guikun & Li, Yanfei, 2025. "Real-time energy management for HEV combining naturalistic driving data and deep reinforcement learning with high generalization," Applied Energy, Elsevier, vol. 377(PA).
- Zhou, Jianhao & Xue, Siwu & Xue, Yuan & Liao, Yuhui & Liu, Jun & Zhao, Wanzhong, 2021. "A novel energy management strategy of hybrid electric vehicle via an improved TD3 deep reinforcement learning," Energy, Elsevier, vol. 224(C).
Most related items
These are the items that most often cite the same works as this one and are cited by the same works as this one.
- Daniel Egan & Qilun Zhu & Robert Prucka, 2023. "A Review of Reinforcement Learning-Based Powertrain Controllers: Effects of Agent Selection for Mixed-Continuity Control and Reward Formulation," Energies, MDPI, vol. 16(8), pages 1-31, April.
- Huang, Ruchen & He, Hongwen & Su, Qicong & Wu, Jingda, 2025. "Towards sustainable and intelligent urban transportation: A novel deep transfer reinforcement learning framework for eco-driving of fuel cell buses," Energy, Elsevier, vol. 330(C).
- Huang, Ruchen & He, Hongwen & Su, Qicong & Härtl, Martin & Jaensch, Malte, 2024. "Enabling cross-type full-knowledge transferable energy management for hybrid electric vehicles via deep transfer reinforcement learning," Energy, Elsevier, vol. 305(C).
- Li, Jianwei & Liu, Jie & Yang, Qingqing & Wang, Tianci & He, Hongwen & Wang, Hanxiao & Sun, Fengchun, 2025. "Reinforcement learning based energy management for fuel cell hybrid electric vehicles: A comprehensive review on decision process reformulation and strategy implementation," Renewable and Sustainable Energy Reviews, Elsevier, vol. 213(C).
- Wang, Hanchen & Ye, Yiming & Zhang, Jiangfeng & Xu, Bin, 2023. "A comparative study of 13 deep reinforcement learning based energy management methods for a hybrid electric vehicle," Energy, Elsevier, vol. 266(C).
- Liu, Yonggang & Wu, Yitao & Wang, Xiangyu & Li, Liang & Zhang, Yuanjian & Chen, Zheng, 2023. "Energy management for hybrid electric vehicles based on imitation reinforcement learning," Energy, Elsevier, vol. 263(PC).
- Huang, Ruchen & He, Hongwen & Su, Qicong & Härtl, Martin & Jaensch, Malte, 2025. "Type- and task-crossing energy management for fuel cell vehicles with longevity consideration: A heterogeneous deep transfer reinforcement learning framework," Applied Energy, Elsevier, vol. 377(PC).
- Nie, Zhigen & Feng, Yaxing & Lian, Yufeng, 2025. "Deep reinforcement learning-based hierarchical control strategy for energy management of intelligent fuel cell hybrid electric vehicles," Energy, Elsevier, vol. 326(C).
- Liu, Zemin Eitan & Li, Yong & Zhou, Quan & Shuai, Bin & Hua, Min & Xu, Hongming & Xu, Lubing & Tan, Guikun & Li, Yanfei, 2025. "Real-time energy management for HEV combining naturalistic driving data and deep reinforcement learning with high generalization," Applied Energy, Elsevier, vol. 377(PA).
- Zhang, Hao & Yang, Guixiang & Lei, Nuo & Chen, Chaoyi & Chen, Boli & Qiu, Lin, 2025. "Scenario-aware electric vehicle energy control with enhanced vehicle-to-grid capability: A multi-task reinforcement learning approach," Energy, Elsevier, vol. 335(C).
- Huang, Ruchen & He, Hongwen & Su, Qicong, 2024. "Smart energy management for hybrid electric bus via improved soft actor-critic algorithm in a heuristic learning framework," Energy, Elsevier, vol. 309(C).
- Mahmud, Sakib & Sayed, Aya Nabil & Himeur, Yassine & Nhlabatsi, Armstrong & Bensaali, Faycal, 2026. "A comprehensive review of deep reinforcement learning applications from centralized power generation to modern energy internet frameworks," Renewable and Sustainable Energy Reviews, Elsevier, vol. 226(PE).
- Fan Wang & Yina Hong & Xiaohuan Zhao, 2025. "Research and Comparative Analysis of Energy Management Strategies for Hybrid Electric Vehicles: A Review," Energies, MDPI, vol. 18(11), pages 1-28, May.
- Yazar, Ozan & Coskun, Serdar & Zhang, Fengqi & Li, Lin & Huang, Cong & Mei, Peng & Karimi, Hamid Reza, 2025. "A novel energy management strategy for hybrid electric vehicles using deep reinforcement incentive learning," Energy, Elsevier, vol. 334(C).
- Niu, Zegong & He, Hongwen, 2024. "A data-driven solution for intelligent power allocation of connected hybrid electric vehicles inspired by offline deep reinforcement learning in V2X scenario," Applied Energy, Elsevier, vol. 372(C).
- Chen, Sihan & Huang, Yin & Zhang, Jie & Yu, Xinshu & Lu, Yifan & Xuan, Dongji, 2025. "Research on a novel multi-agent deep reinforcement learning eco-driving framework," Energy, Elsevier, vol. 326(C).
- Tang, Xiaolin & Zhou, Haitao & Wang, Feng & Wang, Weida & Lin, Xianke, 2022. "Longevity-conscious energy management strategy of fuel cell hybrid electric Vehicle Based on deep reinforcement learning," Energy, Elsevier, vol. 238(PA).
- Li, Xueliang & Liu, Yilong & Yan, Mei & Tian, Dayu & Yang, Shujun & Peng, Zengxiong, 2026. "Data-driven a convergence-enhanced fusion energy management strategy based on teacher agent guidance for hybrid electric vehicles," Applied Energy, Elsevier, vol. 404(C).
- Liu, Hui & You, Congwen & Han, Lijin & Yang, Ningkang & Liu, Baoshuai, 2025. "Off-road hybrid electric vehicle energy management strategy using multi-agent soft actor-critic with collaborative-independent algorithm," Energy, Elsevier, vol. 328(C).
- Li, Menglin & Yin, Long & Yan, Mei & Wu, Jingda & He, Hongwen & Jia, Chunchun, 2024. "Hierarchical intelligent energy-saving control strategy for fuel cell hybrid electric buses based on traffic flow predictions," Energy, Elsevier, vol. 304(C).
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:appene:v:402:y:2025:i:pa:s0306261925016320. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu. General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/description#description.
Please note that corrections may take a couple of weeks to filter through the various RePEc services.
Printed from https://ideas.repec.org/a/eee/appene/v402y2025ipas0306261925016320.html