
A Q-learning fuzzy inference system based online energy management strategy for off-road hybrid electric vehicles

Author

Listed:
  • Bo, Lin
  • Han, Lijin
  • Xiang, Changle
  • Liu, Hui
  • Ma, Tian

Abstract

In this paper, a Q-learning fuzzy inference system (QLFIS)-based online control architecture is proposed and applied to the optimal control of off-road hybrid electric vehicles (HEVs) to achieve better dynamic performance, fuel economy and real-time performance. A dynamic model, comprising the hybrid powertrain, vehicle dynamics and a road model, is established to obtain state feedback from the current driving environment and driver commands. Both the optimal control strategy and the objective function are constructed with an adaptive network-based fuzzy inference system (ANFIS) owing to its strong approximation ability. The fuzzy rules and parameters are trained online through the Q-learning algorithm and the gradient descent method. This framework offers a new approach to the control of off-road vehicles: without knowing the driving cycle in advance, it achieves good control performance in different driving environments through online data collection and training. The QLFIS-based strategy is compared with dynamic programming (DP)-based and rule-based strategies on two different off-road driving cycles in simulation. The results show that vehicle dynamic performance and fuel economy are improved with respect to the rule-based strategy, while the computation time is greatly reduced compared with that of the DP-based strategy.
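To make the training loop described in the abstract more concrete, the sketch below shows in outline how a Q-learning update can tune a simple Takagi-Sugeno fuzzy approximator of the action-value function: the fuzzy system predicts Q-values, Q-learning supplies the temporal-difference target, and gradient descent on the TD error adjusts the rule consequents. This is a minimal illustrative sketch, not the authors' QLFIS implementation; the class names, the one-dimensional SOC-like state, the three discrete power-split actions and all hyperparameters are hypothetical placeholders.

```python
# Minimal sketch (hypothetical, not the paper's code): a zero-order Sugeno fuzzy
# system approximates Q(state, action); Q-learning provides the TD target and
# gradient descent on the TD error tunes the rule consequents online.
import numpy as np

class FuzzyQApproximator:
    """Zero-order Sugeno fuzzy system: Gaussian memberships over the state,
    one consequent (Q-value) per rule and per discrete action."""

    def __init__(self, centers, sigma, n_actions):
        self.centers = np.asarray(centers, dtype=float)   # (n_rules, state_dim)
        self.sigma = float(sigma)                          # shared membership width
        self.q = np.zeros((len(centers), n_actions))       # rule consequents

    def firing(self, state):
        # Normalized rule firing strengths from Gaussian memberships.
        d2 = np.sum((self.centers - state) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * self.sigma ** 2))
        return w / (w.sum() + 1e-12)

    def q_values(self, state):
        return self.firing(state) @ self.q                 # (n_actions,)

    def update(self, state, action, td_error, lr):
        # For a zero-order Sugeno system, dQ(s, a)/d(consequent) is the firing
        # strength, so the gradient step is a firing-weighted TD update per rule.
        self.q[:, action] += lr * td_error * self.firing(state)


def q_learning_step(fis, s, a, r, s_next, gamma=0.95, lr=0.1):
    """One online Q-learning update of the fuzzy approximator."""
    td_target = r + gamma * np.max(fis.q_values(s_next))
    td_error = td_target - fis.q_values(s)[a]
    fis.update(s, a, td_error, lr)
    return td_error


if __name__ == "__main__":
    # Toy usage: 1-D state (e.g. normalized battery SOC), 3 discrete power-split actions.
    rng = np.random.default_rng(0)
    fis = FuzzyQApproximator(centers=[[0.2], [0.5], [0.8]], sigma=0.2, n_actions=3)
    s = np.array([0.5])
    for _ in range(200):
        a = int(np.argmax(fis.q_values(s))) if rng.random() > 0.1 else int(rng.integers(3))
        # Placeholder environment: reward favours keeping SOC near 0.5.
        s_next = np.clip(s + 0.05 * (a - 1) + 0.01 * rng.standard_normal(1), 0.0, 1.0)
        r = -abs(s_next[0] - 0.5)
        q_learning_step(fis, s, a, r, s_next)
        s = s_next
```

In this reduced form only the consequents are learned; the QLFIS described in the paper also tunes the fuzzy rules and membership parameters online with the same Q-learning plus gradient-descent scheme.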

Suggested Citation

  • Bo, Lin & Han, Lijin & Xiang, Changle & Liu, Hui & Ma, Tian, 2022. "A Q-learning fuzzy inference system based online energy management strategy for off-road hybrid electric vehicles," Energy, Elsevier, vol. 252(C).
  • Handle: RePEc:eee:energy:v:252:y:2022:i:c:s0360544222008799
    DOI: 10.1016/j.energy.2022.123976

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0360544222008799
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.energy.2022.123976?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Lin, Xinyou & Wu, Jiayun & Wei, Yimin, 2021. "An ensemble learning velocity prediction-based energy management strategy for a plug-in hybrid electric vehicle considering driving pattern adaptive reference SOC," Energy, Elsevier, vol. 234(C).
    2. Du, Guodong & Zou, Yuan & Zhang, Xudong & Liu, Teng & Wu, Jinlong & He, Dingbo, 2020. "Deep reinforcement learning based energy management for a hybrid electric vehicle," Energy, Elsevier, vol. 201(C).
    3. Song, Ke & Wang, Xiaodi & Li, Feiqiang & Sorrentino, Marco & Zheng, Bailin, 2020. "Pontryagin’s minimum principle-based real-time energy management strategy for fuel cell hybrid electric vehicle considering both fuel economy and power source durability," Energy, Elsevier, vol. 205(C).
    4. Li, Shuangqi & He, Hongwen & Zhao, Pengfei, 2021. "Energy management for hybrid energy storage system in electric vehicle: A cyber-physical system perspective," Energy, Elsevier, vol. 230(C).
    5. Zhao, Chen & Zu, Bingfeng & Xu, Yuliang & Wang, Zhen & Zhou, Jianwei & Liu, Lina, 2020. "Design and analysis of an engine-start control strategy for a single-shaft parallel hybrid electric vehicle," Energy, Elsevier, vol. 202(C).
    6. Xiong, Rui & Duan, Yanzhou & Cao, Jiayi & Yu, Quanqing, 2018. "Battery and ultracapacitor in-the-loop approach to validate a real-time power management method for an all-climate electric vehicle," Applied Energy, Elsevier, vol. 217(C), pages 153-165.
    7. Xie, Shaobo & Lang, Kun & Qi, Shanwei, 2020. "Aerodynamic-aware coordinated control of following speed and power distribution for hybrid electric trucks," Energy, Elsevier, vol. 209(C).
    8. Yang, Ningkang & Han, Lijin & Xiang, Changle & Liu, Hui & Li, Xunmin, 2021. "An indirect reinforcement learning based real-time energy management strategy via high-order Markov Chain model for a hybrid electric vehicle," Energy, Elsevier, vol. 236(C).
    9. He, Hongwen & Xiong, Rui & Zhao, Kai & Liu, Zhentong, 2013. "Energy management strategy research on a hybrid power system by hardware-in-loop experiments," Applied Energy, Elsevier, vol. 112(C), pages 1311-1317.
    10. Wu, Jingda & He, Hongwen & Peng, Jiankun & Li, Yuecheng & Li, Zhanjiang, 2018. "Continuous reinforcement learning of energy management with deep Q network for a power split hybrid electric bus," Applied Energy, Elsevier, vol. 222(C), pages 799-811.
    11. Lu, Xi & Xia, Shiwei & Gu, Wei & Chan, Ka Wing & Shahidehpour, Mohammad, 2021. "Two-stage robust distribution system operation by coordinating electric vehicle aggregator charging and load curtailments," Energy, Elsevier, vol. 226(C).
    12. Xu, Bin & Shi, Junzhe & Li, Sixu & Li, Huayi & Wang, Zhe, 2021. "Energy consumption and battery aging minimization using a Q-learning strategy for a battery/ultracapacitor electric vehicle," Energy, Elsevier, vol. 229(C).
    Full references (including those not matched with items on IDEAS)

    Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Tang, Wenbin & Wang, Yaqian & Jiao, Xiaohong & Ren, Lina, 2023. "Hierarchical energy management strategy based on adaptive dynamic programming for hybrid electric vehicles in car-following scenarios," Energy, Elsevier, vol. 265(C).
    2. Mudhafar Al-Saadi & Maher Al-Greer & Michael Short, 2023. "Reinforcement Learning-Based Intelligent Control Strategies for Optimal Power Management in Advanced Power Distribution Systems: A Survey," Energies, MDPI, vol. 16(4), pages 1-38, February.
    3. Ruan, Shumin & Ma, Yue & Yang, Ningkang & Yan, Qi & Xiang, Changle, 2023. "Multiobjective optimization of longitudinal dynamics and energy management for HEVs based on Nash bargaining game," Energy, Elsevier, vol. 262(PA).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Xiao, B. & Ruan, J. & Yang, W. & Walker, P.D. & Zhang, N., 2021. "A review of pivotal energy management strategies for extended range electric vehicles," Renewable and Sustainable Energy Reviews, Elsevier, vol. 149(C).
    2. Dong, Peng & Zhao, Junwei & Liu, Xuewu & Wu, Jian & Xu, Xiangyang & Liu, Yanfang & Wang, Shuhan & Guo, Wei, 2022. "Practical application of energy management strategy for hybrid electric vehicles based on intelligent and connected technologies: Development stages, challenges, and future trends," Renewable and Sustainable Energy Reviews, Elsevier, vol. 170(C).
    3. Daniel Egan & Qilun Zhu & Robert Prucka, 2023. "A Review of Reinforcement Learning-Based Powertrain Controllers: Effects of Agent Selection for Mixed-Continuity Control and Reward Formulation," Energies, MDPI, vol. 16(8), pages 1-31, April.
    4. Miranda, Matheus H.R. & Silva, Fabrício L. & Lourenço, Maria A.M. & Eckert, Jony J. & Silva, Ludmila C.A., 2022. "Vehicle drivetrain and fuzzy controller optimization using a planar dynamics simulation based on a real-world driving cycle," Energy, Elsevier, vol. 257(C).
    5. Qi, Chunyang & Zhu, Yiwen & Song, Chuanxue & Yan, Guangfu & Xiao, Feng & Wang, Da & Zhang, Xu & Cao, Jingwei & Song, Shixin, 2022. "Hierarchical reinforcement learning based energy management strategy for hybrid electric vehicle," Energy, Elsevier, vol. 238(PA).
    6. Yang, Dongpo & Liu, Tong & Song, Dafeng & Zhang, Xuanming & Zeng, Xiaohua, 2023. "A real time multi-objective optimization Guided-MPC strategy for power-split hybrid electric bus based on velocity prediction," Energy, Elsevier, vol. 276(C).
    7. Zhengyu Yao & Hwan-Sik Yoon & Yang-Ki Hong, 2023. "Control of Hybrid Electric Vehicle Powertrain Using Offline-Online Hybrid Reinforcement Learning," Energies, MDPI, vol. 16(2), pages 1-18, January.
    8. Wang, Yue & Li, Keqiang & Zeng, Xiaohua & Gao, Bolin & Hong, Jichao, 2023. "Investigation of novel intelligent energy management strategies for connected HEB considering global planning of fixed-route information," Energy, Elsevier, vol. 263(PB).
    9. Guo, Xiaokai & Yan, Xianguo & Chen, Zhi & Meng, Zhiyu, 2022. "Research on energy management strategy of heavy-duty fuel cell hybrid vehicles based on dueling-double-deep Q-network," Energy, Elsevier, vol. 260(C).
    10. Kong, Yan & Xu, Nan & Liu, Qiao & Sui, Yan & Yue, Fenglai, 2023. "A data-driven energy management method for parallel PHEVs based on action dependent heuristic dynamic programming (ADHDP) model," Energy, Elsevier, vol. 265(C).
    11. Vázquez-Canteli, José R. & Nagy, Zoltán, 2019. "Reinforcement learning for demand response: A review of algorithms and modeling techniques," Applied Energy, Elsevier, vol. 235(C), pages 1072-1089.
    12. Liu, Bo & Sun, Chao & Wang, Bo & Liang, Weiqiang & Ren, Qiang & Li, Junqiu & Sun, Fengchun, 2022. "Bi-level convex optimization of eco-driving for connected Fuel Cell Hybrid Electric Vehicles through signalized intersections," Energy, Elsevier, vol. 252(C).
    13. Ramya Kuppusamy & Srete Nikolovski & Yuvaraja Teekaraman, 2023. "Review of Machine Learning Techniques for Power Quality Performance Evaluation in Grid-Connected Systems," Sustainability, MDPI, vol. 15(20), pages 1-29, October.
    14. Du, Guodong & Zou, Yuan & Zhang, Xudong & Guo, Lingxiong & Guo, Ningyuan, 2022. "Energy management for a hybrid electric vehicle based on prioritized deep reinforcement learning framework," Energy, Elsevier, vol. 241(C).
    15. Qi, Chunyang & Song, Chuanxue & Xiao, Feng & Song, Shixin, 2022. "Generalization ability of hybrid electric vehicle energy management strategy based on reinforcement learning method," Energy, Elsevier, vol. 250(C).
    16. Liu, Teng & Tan, Wenhao & Tang, Xiaolin & Zhang, Jinwei & Xing, Yang & Cao, Dongpu, 2021. "Driving conditions-driven energy management strategies for hybrid electric vehicles: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 151(C).
    17. Tang, Xiaolin & Zhou, Haitao & Wang, Feng & Wang, Weida & Lin, Xianke, 2022. "Longevity-conscious energy management strategy of fuel cell hybrid electric Vehicle Based on deep reinforcement learning," Energy, Elsevier, vol. 238(PA).
    18. Wei, Zhengchao & Ma, Yue & Yang, Ningkang & Ruan, Shumin & Xiang, Changle, 2023. "Reinforcement learning based power management integrating economic rotational speed of turboshaft engine and safety constraints of battery for hybrid electric power system," Energy, Elsevier, vol. 263(PB).
    19. Harri Aaltonen & Seppo Sierla & Rakshith Subramanya & Valeriy Vyatkin, 2021. "A Simulation Environment for Training a Reinforcement Learning Agent Trading a Battery Storage," Energies, MDPI, vol. 14(17), pages 1-20, September.
    20. Sun, Alexander Y., 2020. "Optimal carbon storage reservoir management through deep reinforcement learning," Applied Energy, Elsevier, vol. 278(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:energy:v:252:y:2022:i:c:s0360544222008799. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.journals.elsevier.com/energy.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.