IDEAS home Printed from https://ideas.repec.org/a/eee/appene/v397y2025ics0306261925010451.html

Long-term efficient energy management for multi-station collaborative electric vehicle charging: A transformer-based multi-agent reinforcement learning approach

Author

Listed:
  • Song, Ge
  • Xie, Hongbin
  • Zhang, Jingyuan
  • Fu, Hongdi
  • Shi, Zhuoran
  • Feng, Defan
  • Song, Xuan
  • Zhang, Haoran

Abstract

With the rapid adoption of electric vehicles, energy management for EV charging and maximizing the utilization of green energy have become increasingly critical. Existing studies have demonstrated that reinforcement learning can substantially improve power-dispatch efficiency. However, in complex scenarios involving multiple stations and charging sites, significant challenges remain in leveraging mutual information to capture long-term temporal relationships and in handling the massive state and action spaces. To address these gaps (large-scale data, long-term temporal dependencies, and inter-station communication in multi-station collaborative EV charging energy management), we propose MAHEM, a transformer-based multi-agent reinforcement learning algorithm. MAHEM uses a transformer to capture long-term temporal features in sequential data during distributed execution and reduces the complexity of the action space through Q-value decomposition. Agents communicate effectively through the attention mechanism, while the transformer captures long-term temporal information and accelerates training and convergence by predicting future states. Experimental results show that, compared with existing baselines, our method reduces the total charging cost across stations by 31.6% and achieves the best performance across various environments, robustness tests, and transfer tests. This highlights the practicality and effectiveness of MAHEM in addressing the challenges of EV energy management systems.
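The abstract's two core mechanisms, attention-based communication between station agents and Q-value decomposition over the joint action space, can be illustrated with a minimal numpy sketch. All names here are illustrative, and the additive (VDN-style) sum of per-agent Q-values is one common decomposition, not necessarily the exact scheme MAHEM uses:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    """Scaled dot-product attention: each agent attends to every agent's message."""
    scores = q @ k.T / np.sqrt(q.shape[-1])   # (n_agents, n_agents)
    return softmax(scores, axis=-1) @ v       # (n_agents, d)

rng = np.random.default_rng(0)
n_agents, d, n_actions = 3, 8, 5              # 3 stations, 5 charging-power levels

# Per-station state embeddings; in the paper these would come from a
# transformer encoder over each station's charging-sequence history.
h = rng.normal(size=(n_agents, d))

# Inter-station communication via attention over the agents' embeddings.
W_q, W_k, W_v = (rng.normal(size=(d, d)) for _ in range(3))
msg = attention(h @ W_q, h @ W_k, h @ W_v)

# Per-agent Q-head on the local embedding plus the communicated message.
W_out = rng.normal(size=(d, n_actions))
q_per_agent = (h + msg) @ W_out               # (n_agents, n_actions)

# Additive decomposition: the joint Q-value is the sum of each agent's Q for
# its greedily chosen action, so each agent searches only its own small
# action space (5 options) instead of the joint one (5**3 combinations).
actions = q_per_agent.argmax(axis=1)
q_joint = q_per_agent[np.arange(n_agents), actions].sum()
```

The design point is the size of the search: with the decomposition, action selection costs `n_agents * n_actions` evaluations rather than `n_actions ** n_agents` for the joint space.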

Suggested Citation

  • Song, Ge & Xie, Hongbin & Zhang, Jingyuan & Fu, Hongdi & Shi, Zhuoran & Feng, Defan & Song, Xuan & Zhang, Haoran, 2025. "Long-term efficient energy management for multi-station collaborative electric vehicle charging: A transformer-based multi-agent reinforcement learning approach," Applied Energy, Elsevier, vol. 397(C).
  • Handle: RePEc:eee:appene:v:397:y:2025:i:c:s0306261925010451
    DOI: 10.1016/j.apenergy.2025.126315

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0306261925010451
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.apenergy.2025.126315?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Wang, Yi & Qiu, Dawei & He, Yinglong & Zhou, Quan & Strbac, Goran, 2023. "Multi-agent reinforcement learning for electric vehicle decarbonized routing and scheduling," Energy, Elsevier, vol. 284(C).
    2. Qiu, Dawei & Wang, Yi & Hua, Weiqi & Strbac, Goran, 2023. "Reinforcement learning for electric vehicle applications in power systems: A critical review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 173(C).
    3. Ignatov, Augustin, 2024. "European highway networks, transportation costs, and regional income," Regional Science and Urban Economics, Elsevier, vol. 104(C).
    4. Zhang, Jing & Yan, Jie & Liu, Yongqian & Zhang, Haoran & Lv, Guoliang, 2020. "Daily electric vehicle charging load profiles considering demographics of vehicle users," Applied Energy, Elsevier, vol. 274(C).
    5. Park, Keonwoo & Moon, Ilkyeong, 2022. "Multi-agent deep reinforcement learning approach for EV charging scheduling in a smart grid," Applied Energy, Elsevier, vol. 328(C).
    6. Elinor Ginzburg-Ganz & Itay Segev & Alexander Balabanov & Elior Segev & Sivan Kaully Naveh & Ram Machlev & Juri Belikov & Liran Katzir & Sarah Keren & Yoash Levron, 2024. "Reinforcement Learning Model-Based and Model-Free Paradigms for Optimal Control Problems in Power Systems: Comprehensive Review and Future Directions," Energies, MDPI, vol. 17(21), pages 1-54, October.
    7. Edmund K Burke & Michel Gendreau & Matthew Hyde & Graham Kendall & Gabriela Ochoa & Ender Özcan & Rong Qu, 2013. "Hyper-heuristics: a survey of the state of the art," Journal of the Operational Research Society, Palgrave Macmillan;The OR Society, vol. 64(12), pages 1695-1724, December.
    8. Mahmud, Khizir & Town, Graham E., 2016. "A review of computer tools for modeling electric vehicle energy requirements and their impact on power distribution networks," Applied Energy, Elsevier, vol. 172(C), pages 337-359.
    9. Van Can Nguyen & Chi-Tai Wang & Ying-Jiun Hsieh, 2021. "Electrification of Highway Transportation with Solar and Wind Energy," Sustainability, MDPI, vol. 13(10), pages 1-28, May.
    10. Chen, Haoqian & Sui, Yi & Shang, Wen-long & Sun, Rencheng & Chen, Zhiheng & Wang, Changying & Han, Chunjia & Zhang, Yuqian & Zhang, Haoran, 2022. "Towards renewable public transport: Mining the performance of electric buses using solar-radiation as an auxiliary power source," Applied Energy, Elsevier, vol. 325(C).
    11. Pegah Alaee & Julius Bems & Amjad Anvari-Moghaddam, 2023. "A Review of the Latest Trends in Technical and Economic Aspects of EV Charging Management," Energies, MDPI, vol. 16(9), pages 1-28, April.
    12. Niphon Kaewdornhan & Chitchai Srithapon & Rittichai Liemthong & Rongrit Chatthaworn, 2023. "Real-Time Multi-Home Energy Management with EV Charging Scheduling Using Multi-Agent Deep Reinforcement Learning Optimization," Energies, MDPI, vol. 16(5), pages 1-25, March.
    13. Vázquez-Canteli, José R. & Nagy, Zoltán, 2019. "Reinforcement learning for demand response: A review of algorithms and modeling techniques," Applied Energy, Elsevier, vol. 235(C), pages 1072-1089.
    14. Zhou, Jianshu & Xiang, Yue & Zhang, Xin & Sun, Zhou & Liu, Xuefei & Liu, Junyong, 2025. "Optimal self-consumption scheduling of highway electric vehicle charging station based on multi-agent deep reinforcement learning," Renewable Energy, Elsevier, vol. 238(C).
    15. Jendoubi, Imen & Bouffard, François, 2023. "Multi-agent hierarchical reinforcement learning for energy management," Applied Energy, Elsevier, vol. 332(C).
    16. Su, Chengguo & Wang, Lingshuang & Sui, Quan & Wu, Huijun, 2025. "Optimal scheduling of a cascade hydro-thermal-wind power system integrating data centers and considering the spatiotemporal asynchronous transfer of energy resources," Applied Energy, Elsevier, vol. 377(PA).
    17. Ruisheng Wang & Zhong Chen & Qiang Xing & Ziqi Zhang & Tian Zhang, 2022. "A Modified Rainbow-Based Deep Reinforcement Learning Method for Optimal Scheduling of Charging Station," Sustainability, MDPI, vol. 14(3), pages 1-14, February.
    18. Yu, Qing & Wang, Zhen & Song, Yancun & Shen, Xinwei & Zhang, Haoran, 2024. "Potential and flexibility analysis of electric taxi fleets V2G system based on trajectory data and agent-based modeling," Applied Energy, Elsevier, vol. 355(C).
    19. Hemmatpour, Mohammad Hasan & Rezaeian Koochi, Mohammad Hossein & Dehghanian, Pooria & Dehghanian, Payman, 2022. "Voltage and energy control in distribution systems in the presence of flexible loads considering coordinated charging of electric vehicles," Energy, Elsevier, vol. 239(PA).
    20. Hegde, Bharatkumar & Ahmed, Qadeer & Rizzoni, Giorgio, 2022. "Energy saving analysis in electrified powertrain using look-ahead energy management scheme," Applied Energy, Elsevier, vol. 325(C).
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project, subscribe to its RSS feed for this item.


    Cited by:

    1. Mohammed Alsolami & Ahmad Alferidi & Badr Lami, 2025. "Real-Time Energy Management of a Microgrid Using MPC-DDQN-Controlled V2H and H2V Operations with Renewable Energy Integration," Energies, MDPI, vol. 18(17), pages 1-26, August.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Xie, Hongbin & Song, Ge & Shi, Zhuoran & Peng, Likun & Feng, Defan & Song, Xuan, 2025. "Stable energy management for highway electric vehicle charging based on reinforcement learning," Applied Energy, Elsevier, vol. 389(C).
    2. Kakkar, Riya & Agrawal, Smita & Tanwar, Sudeep, 2024. "A systematic survey on demand response management schemes for electric vehicles," Renewable and Sustainable Energy Reviews, Elsevier, vol. 203(C).
    3. Cheng, Xiu & Li, Wenbo & Yang, Jiameng & Zhang, Linling, 2023. "How convenience and informational tools shape waste separation behavior: A social network approach," Resources Policy, Elsevier, vol. 86(PB).
    4. Xu, Hairun & Zhang, Ao & Wang, Qingle & Hu, Yang & Fang, Fang & Cheng, Long, 2025. "Quantum Reinforcement Learning for real-time optimization in Electric Vehicle charging systems," Applied Energy, Elsevier, vol. 383(C).
    5. Lu, M.L. & Sun, Y.J. & Kokogiannakis, G. & Ma, Z.J., 2024. "Design of flexible energy systems for nearly/net zero energy buildings under uncertainty characteristics: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 205(C).
    6. Zhou, Xinlei & Du, Han & Xue, Shan & Ma, Zhenjun, 2024. "Recent advances in data mining and machine learning for enhanced building energy management," Energy, Elsevier, vol. 307(C).
    7. Yang, Meng & Chen, Yue & Huang, Shihan & Chen, Laijun, 2025. "Recent advances in coordination and optimization of power-transportation systems: An overview," Renewable and Sustainable Energy Reviews, Elsevier, vol. 220(C).
    8. Ahmed M. Abed & Ali AlArjani, 2022. "The Neural Network Classifier Works Efficiently on Searching in DQN Using the Autonomous Internet of Things Hybridized by the Metaheuristic Techniques to Reduce the EVs’ Service Scheduling Time," Energies, MDPI, vol. 15(19), pages 1-25, September.
    9. Su, Chutian & Wang, Yi & Strbac, Goran, 2025. "Coordinated electric vehicles dispatch for multi-service provisions: A comprehensive review of modelling and coordination approaches," Renewable and Sustainable Energy Reviews, Elsevier, vol. 223(C).
    10. Panagiotis Michailidis & Iakovos Michailidis & Elias Kosmatopoulos, 2025. "Reinforcement Learning for Electric Vehicle Charging Management: Theory and Applications," Energies, MDPI, vol. 18(19), pages 1-50, October.
    11. Li, Yujing & Zhang, Zhisheng & Xing, Qiang, 2025. "Real-time online charging control of electric vehicle charging station based on a multi-agent deep reinforcement learning," Energy, Elsevier, vol. 319(C).
    12. Garside, Annisa Kesy & Ahmad, Robiah & Muhtazaruddin, Mohd Nabil Bin, 2024. "A recent review of solution approaches for green vehicle routing problem and its variants," Operations Research Perspectives, Elsevier, vol. 12(C).
    13. Zhong, Zhihua & Zhang, Hongzeng & Ozaki, Jun’ichi & Zhou, Yang & Zhao, Xinjie & Dan, Daniel & Wang, Chaofan, 2025. "A comprehensive methodological review of human mobility simulation and modelling: Current trends, challenges, and future directions," Physica A: Statistical Mechanics and its Applications, Elsevier, vol. 674(C).
    14. Muhammad Ikram & Daryoush Habibi & Asma Aziz, 2025. "Networked Multi-Agent Deep Reinforcement Learning Framework for the Provision of Ancillary Services in Hybrid Power Plants," Energies, MDPI, vol. 18(10), pages 1-34, May.
    15. Gharibvand, Hossein & Gharehpetian, G.B. & Anvari-Moghaddam, A., 2024. "A survey on microgrid flexibility resources, evaluation metrics and energy storage effects," Renewable and Sustainable Energy Reviews, Elsevier, vol. 201(C).
    16. Abid, Md. Shadman & Apon, Hasan Jamil & Hossain, Salman & Ahmed, Ashik & Ahshan, Razzaqul & Lipu, M.S. Hossain, 2024. "A novel multi-objective optimization based multi-agent deep reinforcement learning approach for microgrid resources planning," Applied Energy, Elsevier, vol. 353(PA).
    17. Zhao, Zhonghao & Lee, Carman K.M. & Yan, Xiaoyuan & Wang, Haonan, 2024. "Reinforcement learning for electric vehicle charging scheduling: A systematic review," Transportation Research Part E: Logistics and Transportation Review, Elsevier, vol. 190(C).
    18. Qiu, Dawei & Wang, Yi & Hua, Weiqi & Strbac, Goran, 2023. "Reinforcement learning for electric vehicle applications in power systems: A critical review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 173(C).
    19. Kim, Sunwoo & Choi, Yechan & Park, Joungho & Adams, Derrick & Heo, Seongmin & Lee, Jay H., 2024. "Multi-period, multi-timescale stochastic optimization model for simultaneous capacity investment and energy management decisions for hybrid Micro-Grids with green hydrogen production under uncertainty," Renewable and Sustainable Energy Reviews, Elsevier, vol. 190(PA).
    20. Jin, Ruiyang & Zhou, Yuke & Lu, Chao & Song, Jie, 2022. "Deep reinforcement learning-based strategy for charging station participating in demand response," Applied Energy, Elsevier, vol. 328(C).

    More about this item



    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:appene:v:397:y:2025:i:c:s0306261925010451. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/description#description.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.