
Collaborative optimization strategy of hydrogen fuel cell train energy and thermal management system based on deep reinforcement learning

Authors
  • Jiang, Kangrui
  • Tian, Zhongbei
  • Wen, Tao
  • Song, Kejian
  • Hillmansen, Stuart
  • Ochieng, Washington Yotto

Abstract

Railway decarbonization has become the main direction for the future development of the rail transit industry. Hydrogen fuel cell (HFC) trains are a competitive candidate solution owing to their zero carbon emissions and low retrofit costs. However, the high cost of hydrogen, driven by challenges in storage, transportation, and utilization, remains a major constraint on the commercialization of HFC trains. Temperature strongly affects the energy conversion efficiency and lifetime of an HFC, and its thermal management requirements are more stringent than those of internal combustion engines. Existing HFC train energy management systems (EMS) generally overlook the impact of HFC temperature changes on energy conversion efficiency and struggle to balance energy and thermal management in real time under dynamic environmental conditions. To address this issue, this paper proposes a collaborative optimization energy and thermal management strategy (ETMS) based on deep reinforcement learning (DRL) that minimizes hydrogen consumption and keeps the energy supply system near its optimal temperature, while maintaining the dynamic balance of battery charging and discharging. First, a complete physical model of the HFC train is established. Then, the ETMS is formulated as a Markov decision process (MDP), and an agent is trained with a double deep Q-learning algorithm, interacting with a real passenger-line operating environment, to decide the output power of the HFC. Finally, a simulation test was conducted on the Worcester to Hereford line in the West Midlands region of the UK. The results show that, across the UK's annual temperature range, the proposed method saves more than 5 % and 2 % of energy compared with rule-based and GA-based methods, respectively, while providing better temperature control and SOC maintenance for the energy supply system.
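The abstract's core ingredients — a multi-objective reward (hydrogen use, deviation from the optimal stack temperature, SOC drift) and a double deep Q-learning update — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the weights `W_H2`, `W_TEMP`, `W_SOC`, the reference values `T_OPT` and `SOC_REF`, and the function names are all hypothetical choices made for the example.

```python
# Hypothetical weights and reference values -- chosen for illustration,
# not taken from the paper.
W_H2, W_TEMP, W_SOC = 1.0, 0.1, 0.5
T_OPT = 70.0    # assumed optimal fuel cell stack temperature (deg C)
SOC_REF = 0.6   # assumed target battery state of charge

def reward(h2_consumed_g, stack_temp_c, soc):
    """Multi-objective reward: penalize hydrogen consumption, deviation
    from the optimal stack temperature, and SOC drift from its target."""
    return -(W_H2 * h2_consumed_g
             + W_TEMP * (stack_temp_c - T_OPT) ** 2
             + W_SOC * (soc - SOC_REF) ** 2)

def double_dqn_target(r, q_online_next, q_target_next, gamma=0.99, done=False):
    """Double DQN bootstrap target: the online network selects the next
    action, the target network evaluates it (reduces overestimation bias
    relative to vanilla DQN)."""
    if done:
        return r
    # Action selection via the online network's Q-values...
    a_star = max(range(len(q_online_next)), key=lambda a: q_online_next[a])
    # ...action evaluation via the target network's Q-values.
    return r + gamma * q_target_next[a_star]

if __name__ == "__main__":
    r = reward(h2_consumed_g=2.0, stack_temp_c=72.0, soc=0.58)
    y = double_dqn_target(r, [1.0, 3.0, 2.0], [0.5, 1.5, 2.5])
    print(round(r, 4), round(y, 4))
```

Here each discrete action would correspond to one candidate HFC output power level, and the agent would regress its online Q-network toward `y` at every step while periodically syncing the target network.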

Suggested Citation

  • Jiang, Kangrui & Tian, Zhongbei & Wen, Tao & Song, Kejian & Hillmansen, Stuart & Ochieng, Washington Yotto, 2025. "Collaborative optimization strategy of hydrogen fuel cell train energy and thermal management system based on deep reinforcement learning," Applied Energy, Elsevier, vol. 393(C).
  • Handle: RePEc:eee:appene:v:393:y:2025:i:c:s0306261925007871
    DOI: 10.1016/j.apenergy.2025.126057

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0306261925007871
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.apenergy.2025.126057?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Song, Zhen & Pan, Yue & Chen, Huicui & Zhang, Tong, 2021. "Effects of temperature on the performance of fuel cell hybrid electric vehicles: A review," Applied Energy, Elsevier, vol. 302(C).
    2. Wang, Yong & Wu, Yuankai & Tang, Yingjuan & Li, Qin & He, Hongwen, 2023. "Cooperative energy management and eco-driving of plug-in hybrid electric vehicle via multi-agent reinforcement learning," Applied Energy, Elsevier, vol. 332(C).
    3. He, Hongwen & Meng, Xiangfei & Wang, Yong & Khajepour, Amir & An, Xiaowen & Wang, Renguang & Sun, Fengchun, 2024. "Deep reinforcement learning based energy management strategies for electrified vehicles: Recent advances and perspectives," Renewable and Sustainable Energy Reviews, Elsevier, vol. 192(C).
    4. Xu, Jiamin & Zhang, Caizhi & Wan, Zhongmin & Chen, Xi & Chan, Siew Hwa & Tu, Zhengkai, 2022. "Progress and perspectives of integrated thermal management systems in PEM fuel cell vehicles: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 155(C).
    5. Peng, Hujun & Li, Jianxiang & Löwenstein, Lars & Hameyer, Kay, 2020. "A scalable, causal, adaptive energy management strategy based on optimal control theory for a fuel cell hybrid railway vehicle," Applied Energy, Elsevier, vol. 267(C).
    6. Hwang, Foo Shen & Confrey, Thomas & Reidy, Colin & Picovici, Dorel & Callaghan, Dean & Culliton, David & Nolan, Cathal, 2024. "Review of battery thermal management systems in electric vehicles," Renewable and Sustainable Energy Reviews, Elsevier, vol. 192(C).
    7. Li, Yuecheng & He, Hongwen & Khajepour, Amir & Wang, Hong & Peng, Jiankun, 2019. "Energy management for a power-split hybrid electric bus via deep reinforcement learning with terrain information," Applied Energy, Elsevier, vol. 255(C).

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Qi Jiang & Shusheng Xiong & Baoquan Sun & Ping Chen & Huipeng Chen & Shaopeng Zhu, 2025. "Research on Energy-Saving Control of Automotive PEMFC Thermal Management System Based on Optimal Operating Temperature Tracking," Energies, MDPI, vol. 18(15), pages 1-25, August.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Lei, Nuo & Zhang, Hao & Hu, Jingjing & Hu, Zunyan & Wang, Zhi, 2025. "Sim-to-real design and development of reinforcement learning-based energy management strategies for fuel cell electric vehicles," Applied Energy, Elsevier, vol. 393(C).
    2. Pei, Yaowang & Chen, Fengxiang & Jiao, Jieran & Ye, Huan & Zhang, Caizhi & Jiang, Xiaojie, 2024. "Fuel cell temperature control based on nonlinear transformation mitigating system nonlinearity," Renewable Energy, Elsevier, vol. 230(C).
    3. Cao, Qiming & Min, Haitao & Sun, Weiyi & Zhao, Honghui & Yu, Yuanbin & Zhang, Zhaopu & Jiang, Junyu, 2024. "A method of combining active and passive strategies by genetic algorithm in multi-stage cold start of proton exchange membrane fuel cell," Energy, Elsevier, vol. 288(C).
    4. Tang, Xingwang & Zhang, Yujia & Xu, Sichuan, 2023. "Experimental study of PEM fuel cell temperature characteristic and corresponding automated optimal temperature calibration model," Energy, Elsevier, vol. 283(C).
    5. Yong Wang & Jingda Wu & Hongwen He & Zhongbao Wei & Fengchun Sun, 2025. "Data-driven energy management for electric vehicles using offline reinforcement learning," Nature Communications, Nature, vol. 16(1), pages 1-16, December.
    6. Liu, Zhaoming & Chang, Guofeng & Yuan, Hao & Tang, Wei & Xie, Jiaping & Wei, Xuezhe & Dai, Haifeng, 2023. "Adaptive look-ahead model predictive control strategy of vehicular PEMFC thermal management," Energy, Elsevier, vol. 285(C).
    7. Zhang, Xiaoqing & Ma, Xiao & Zhang, Zhaohuan & Du, Haoyu & Wu, Zhixuan & Li, Zhe & Shuai, Shijin, 2025. "Review and analysis of thermal management for proton exchange membrane fuel cell hybrid power system," Renewable Energy, Elsevier, vol. 244(C).
    8. Tong, He & Chu, Liang & Zhao, Di & Hou, Zhuoran & Guo, Zhiqi, 2025. "Sustainable energy-speed co-optimization for hybrid electric vehicles in dynamic car-following scenarios via multifunctional deep learning policy," Energy, Elsevier, vol. 334(C).
    9. Wang, Zhong & Zhao, Yahui & Zhang, Yahui & Tian, Yang & Jiao, Xiaohong, 2024. "Safe off-policy reinforcement learning on eco-driving for a P2-P3 hybrid electric truck," Energy, Elsevier, vol. 313(C).
    10. Chen, Ben & Deng, Qihao & Yang, Guanghua & Zhou, Yu & Chen, Wenshang & Cai, Yonghua & Tu, Zhengkai, 2023. "Numerical study on heat transfer characteristics and performance evaluation of PEMFC based on multiphase electrochemical model coupled with cooling channel," Energy, Elsevier, vol. 285(C).
    11. Najmi, Aezid-Ul-Hassan & Wahab, Abdul & Prakash, Rohith & Schopen, Oliver & Esch, Thomas & Shabani, Bahman, 2025. "Thermal management of fuel cell-battery electric vehicles: Challenges and solutions," Applied Energy, Elsevier, vol. 387(C).
    12. Pei, Yaowang & Chen, Fengxiang & Zhou, Su & Huo, Haibo & Ye, Huan, 2025. "Inlet gas flow, pressure and temperature control technology for PEMFC stack testing platforms," Energy, Elsevier, vol. 333(C).
    13. Li, Menglin & Yin, Long & Yan, Mei & Wu, Jingda & He, Hongwen & Jia, Chunchun, 2024. "Hierarchical intelligent energy-saving control strategy for fuel cell hybrid electric buses based on traffic flow predictions," Energy, Elsevier, vol. 304(C).
    14. Yang, Ningkang & Han, Lijin & Xiang, Changle & Liu, Hui & Li, Xunmin, 2021. "An indirect reinforcement learning based real-time energy management strategy via high-order Markov Chain model for a hybrid electric vehicle," Energy, Elsevier, vol. 236(C).
    15. Khoshvaght-Aliabadi, Morteza & Ghodrati, Parvaneh & Kang, Yong Tae, 2025. "Developing a novel battery thermal management system utilizing supercritical CO2 as the cooling medium," Applied Energy, Elsevier, vol. 381(C).
    16. Tang, Tianfeng & Peng, Qianlong & Shi, Qing & Peng, Qingguo & Zhao, Jin & Chen, Chaoyi & Wang, Guangwei, 2024. "Energy management of fuel cell hybrid electric bus in mountainous regions: A deep reinforcement learning approach considering terrain characteristics," Energy, Elsevier, vol. 311(C).
    17. Ma, Yan & Hu, Fuyuan & Hu, Yunfeng, 2023. "Energy efficiency improvement of intelligent fuel cell/battery hybrid vehicles through an integrated management strategy," Energy, Elsevier, vol. 263(PE).
    18. Wu, Chunxia & Sun, Yalong & Tang, Heng & Zhang, Shiwei & Yuan, Wei & Zhu, Likuan & Tang, Yong, 2024. "A review on the liquid cooling thermal management system of lithium-ion batteries," Applied Energy, Elsevier, vol. 375(C).
    19. Daniel Egan & Qilun Zhu & Robert Prucka, 2023. "A Review of Reinforcement Learning-Based Powertrain Controllers: Effects of Agent Selection for Mixed-Continuity Control and Reward Formulation," Energies, MDPI, vol. 16(8), pages 1-31, April.
    20. Ding, Yanyan & Jian, Sisi & Yu, Lin, 2025. "How to reduce carbon emissions in the urban transportation systems through carbon markets? Balancing the monetary and environmental benefits," Applied Energy, Elsevier, vol. 377(PB).


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:appene:v:393:y:2025:i:c:s0306261925007871. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/description#description .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.