
Bayesian optimization for hyper-parameter tuning of an improved twin delayed deep deterministic policy gradients based energy management strategy for plug-in hybrid electric vehicles

Authors

  • Wang, Jinhai
  • Du, Changqing
  • Yan, Fuwu
  • Hua, Min
  • Gongye, Xiangyu
  • Yuan, Quan
  • Xu, Hongming
  • Zhou, Quan

Abstract

Hybridization and electrification of vehicles are underway to achieve net-zero emissions for road transport. Emerging deep reinforcement learning (DRL) algorithms show great promise for the efficient energy management of plug-in hybrid electric vehicles (PHEVs), as they offer the potential to approach theoretically optimal performance. However, the brittle convergence properties, high sample complexity, and hyper-parameter sensitivity of DRL algorithms remain major challenges that limit their applicability to real-world tasks. This paper proposes a novel energy management strategy (EMS) for PHEVs based on Bayesian optimization (BO) and an improved Twin Delayed Deep Deterministic Policy Gradient (TD3) algorithm, in which BO is introduced to tune the TD3 hyper-parameters and a non-parametric reward function (NRF) is designed to improve the TD3 algorithm (BO-NRTD3). The work addresses two challenges: (1) through hyper-parameter tuning, the convergence and robustness of the TD3 strategy are significantly improved; and (2) through the NRF, the TD3 strategy can handle system uncertainties. These findings are validated against several state-of-the-art DRL strategies and dynamic programming (DP) using Software-in-the-Loop (SiL) and Hardware-in-the-Loop (HiL) tests. The results show that the energy economy of the BO-NRTD3 strategy reaches up to 98.15% of that of DP, and the strategy is 4.23% more robust than the parametric-reward-function TD3 (PR-TD3) strategy.
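The paper's own tuning code is not reproduced here, so the following is only a minimal sketch of the general idea behind BO-based hyper-parameter tuning for a DRL agent: a Gaussian-process surrogate with an expected-improvement acquisition proposes the next hyper-parameter setting to evaluate. The objective `train_td3_and_evaluate`, the two tuned parameters (learning rate and discount factor), and all numeric choices are illustrative assumptions, not the authors' implementation; in the actual workflow the objective would be a full TD3 training run scored on an energy-economy metric over a driving cycle.

```python
# Minimal Bayesian-optimization sketch over two TD3-style hyper-parameters.
# Illustrative only: the objective is a synthetic placeholder for an expensive
# TD3 training-and-evaluation run on the PHEV energy management task.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Search space (assumed): log10(learning rate) and discount factor gamma.
BOUNDS = np.array([[-5.0, -2.0],    # log10(actor/critic learning rate)
                   [0.90, 0.999]])  # discount factor gamma

def train_td3_and_evaluate(params):
    """Placeholder cost to MINIMIZE (e.g., negative validation energy economy).

    A real objective would train a TD3 agent with these hyper-parameters and
    evaluate it on a validation driving cycle; a smooth synthetic surrogate
    stands in here so the sketch runs end to end.
    """
    log_lr, gamma = params
    return (log_lr + 3.5) ** 2 + 50.0 * (gamma - 0.98) ** 2

def expected_improvement(X, gp, best_y, xi=0.01):
    """Standard expected-improvement acquisition for minimization."""
    mu, sigma = gp.predict(X, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    imp = best_y - mu - xi
    z = imp / sigma
    return imp * norm.cdf(z) + sigma * norm.pdf(z)

def sample_space(n, rng):
    """Uniform random samples inside BOUNDS."""
    return rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(n, BOUNDS.shape[0]))

def bayesian_optimize(n_init=5, n_iter=20, seed=0):
    rng = np.random.default_rng(seed)
    X = sample_space(n_init, rng)                       # initial design
    y = np.array([train_td3_and_evaluate(x) for x in X])
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

    for _ in range(n_iter):
        gp.fit(X, y)                                    # refit surrogate
        candidates = sample_space(1000, rng)            # cheap acquisition search
        ei = expected_improvement(candidates, gp, y.min())
        x_next = candidates[np.argmax(ei)]              # most promising setting
        y_next = train_td3_and_evaluate(x_next)         # expensive evaluation
        X = np.vstack([X, x_next])
        y = np.append(y, y_next)

    best = np.argmin(y)
    return X[best], y[best]

if __name__ == "__main__":
    params, cost = bayesian_optimize()
    print(f"best log10(lr)={params[0]:.3f}, gamma={params[1]:.4f}, cost={cost:.4f}")
```

Because each evaluation of the objective amounts to a full DRL training run, the sample efficiency of BO (a few dozen evaluations rather than an exhaustive sweep) is what makes this style of tuning tractable compared with grid or random search.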

Suggested Citation

  • Wang, Jinhai & Du, Changqing & Yan, Fuwu & Hua, Min & Gongye, Xiangyu & Yuan, Quan & Xu, Hongming & Zhou, Quan, 2025. "Bayesian optimization for hyper-parameter tuning of an improved twin delayed deep deterministic policy gradients based energy management strategy for plug-in hybrid electric vehicles," Applied Energy, Elsevier, vol. 381(C).
  • Handle: RePEc:eee:appene:v:381:y:2025:i:c:s0306261924025558
    DOI: 10.1016/j.apenergy.2024.125171

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0306261924025558
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.apenergy.2024.125171?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Zhang, Hao & Lei, Nuo & Chen, Boli & Li, Bingbing & Li, Rulong & Wang, Zhi, 2024. "Modeling and control system optimization for electrified vehicles: A data-driven approach," Energy, Elsevier, vol. 310(C).
    2. Zhou, Jie & Zhang, Tiezhu & Zhang, Hongxin & Zhang, Zhen & Hong, Jichao & Yang, Jian, 2024. "Energy management strategy for electro-hydraulic hybrid electric vehicles considering optimal mode switching: A soft actor-critic approach trained on a multi-modal driving cycle," Energy, Elsevier, vol. 305(C).
    3. Liu, Yonggang & Wu, Yitao & Wang, Xiangyu & Li, Liang & Zhang, Yuanjian & Chen, Zheng, 2023. "Energy management for hybrid electric vehicles based on imitation reinforcement learning," Energy, Elsevier, vol. 263(PC).
    4. Sun, Wenjing & Zou, Yuan & Zhang, Xudong & Guo, Ningyuan & Zhang, Bin & Du, Guodong, 2022. "High robustness energy management strategy of hybrid electric vehicle based on improved soft actor-critic deep reinforcement learning," Energy, Elsevier, vol. 258(C).
    5. Hu, Dong & Huang, Chao & Yin, Guodong & Li, Yangmin & Huang, Yue & Huang, Hailong & Wu, Jingda & Li, Wenfei & Xie, Hui, 2024. "A transfer-based reinforcement learning collaborative energy management strategy for extended-range electric buses with cabin temperature comfort consideration," Energy, Elsevier, vol. 290(C).
    6. Shi, Dehua & Xu, Han & Wang, Shaohua & Hu, Jia & Chen, Long & Yin, Chunfang, 2024. "Deep reinforcement learning based adaptive energy management for plug-in hybrid electric vehicle with double deep Q-network," Energy, Elsevier, vol. 305(C).
    7. Liu, Weirong & Yao, Pengfei & Wu, Yue & Duan, Lijun & Li, Heng & Peng, Jun, 2025. "Imitation reinforcement learning energy management for electric vehicles with hybrid energy storage system," Applied Energy, Elsevier, vol. 378(PA).
    8. Zhou, Jianhao & Xue, Yuan & Xu, Da & Li, Chaoxiong & Zhao, Wanzhong, 2022. "Self-learning energy management strategy for hybrid electric vehicle via curiosity-inspired asynchronous deep reinforcement learning," Energy, Elsevier, vol. 242(C).
    9. Dong, Peng & Zhao, Junwei & Liu, Xuewu & Wu, Jian & Xu, Xiangyang & Liu, Yanfang & Wang, Shuhan & Guo, Wei, 2022. "Practical application of energy management strategy for hybrid electric vehicles based on intelligent and connected technologies: Development stages, challenges, and future trends," Renewable and Sustainable Energy Reviews, Elsevier, vol. 170(C).
    10. Zou, Yunge & Yang, Yalian & Zhang, Yuxin & Liu, Changdong, 2024. "Computationally efficient assessment of fuel economy of multi-modes and multi-gears hybrid electric vehicles: A hyper rapid dynamic programming approach," Energy, Elsevier, vol. 313(C).
    11. Kunyu Wang & Rong Yang & Yongjian Zhou & Wei Huang & Song Zhang, 2022. "Design and Improvement of SD3-Based Energy Management Strategy for a Hybrid Electric Urban Bus," Energies, MDPI, vol. 15(16), pages 1-21, August.
    12. Juan Carlos Paredes-Rojas & Ramón Costa-Castelló & Rubén Vázquez-Medina & Juan Alejandro Flores-Campos & Christopher Rene Torres-San Miguel, 2025. "Experimental Study on Using Biodiesel in Hybrid Electric Vehicles," Energies, MDPI, vol. 18(7), pages 1-22, March.
    13. Alessia Musa & Pier Giuseppe Anselma & Giovanni Belingardi & Daniela Anna Misul, 2023. "Energy Management in Hybrid Electric Vehicles: A Q-Learning Solution for Enhanced Drivability and Energy Efficiency," Energies, MDPI, vol. 17(1), pages 1-20, December.
    14. Wang, Hanchen & Arjmandzadeh, Ziba & Ye, Yiming & Zhang, Jiangfeng & Xu, Bin, 2024. "FlexNet: A warm start method for deep reinforcement learning in hybrid electric vehicle energy management applications," Energy, Elsevier, vol. 288(C).
    15. Hu, Dong & Huang, Chao & Wu, Jingda & Wei, Henglai & Pi, Dawei, 2025. "Enhancing data-driven energy management strategy via digital expert guidance for electrified vehicles," Applied Energy, Elsevier, vol. 381(C).
    16. Gao, Sichen & Zong, Yuhua & Ju, Fei & Wang, Qun & Huo, Weiwei & Wang, Liangmo & Wang, Tao, 2024. "Scenario-oriented adaptive ECMS using speed prediction for fuel cell vehicles in real-world driving," Energy, Elsevier, vol. 304(C).
    17. Chen, Fujun & Wang, Bowen & Ni, Meng & Gong, Zhichao & Jiao, Kui, 2024. "Online energy management strategy for ammonia-hydrogen hybrid electric vehicles harnessing deep reinforcement learning," Energy, Elsevier, vol. 301(C).
    18. Tan, Yingqi & Xu, Jingyi & Ma, Junyi & Li, Zirui & Chen, Huiyan & Xi, Junqiang & Liu, Haiou, 2024. "A transferable perception-guided EMS for series hybrid electric unmanned tracked vehicles," Energy, Elsevier, vol. 306(C).
    19. Liu, Zemin Eitan & Li, Yong & Zhou, Quan & Shuai, Bin & Hua, Min & Xu, Hongming & Xu, Lubing & Tan, Guikun & Li, Yanfei, 2025. "Real-time energy management for HEV combining naturalistic driving data and deep reinforcement learning with high generalization," Applied Energy, Elsevier, vol. 377(PA).
    20. Wang, Yue & Li, Keqiang & Zeng, Xiaohua & Gao, Bolin & Hong, Jichao, 2023. "Investigation of novel intelligent energy management strategies for connected HEB considering global planning of fixed-route information," Energy, Elsevier, vol. 263(PB).
