
Real-time online charging control of electric vehicle charging station based on a multi-agent deep reinforcement learning

Author

Listed:
  • Li, Yujing
  • Zhang, Zhisheng
  • Xing, Qiang

Abstract

This paper proposes a multi-agent deep reinforcement learning-based charging scheduling strategy for electric vehicle (EV) charging stations, aiming to solve the problem of real-time online charging control of multiple EVs within a single charging station under an uncertain charging environment with random EV arrivals and departures. The proposed approach seeks to maximize the benefits of both EV drivers and charging station operators. First, a coordinated control framework for EV charging in the coupled transportation-electrification system is constructed, and a Markov decision process is used to describe the charging scheduling of a single EV. The scheduling objective accounts for the charging station's revenue, its overload penalty, EV drivers' charging comfort in the charging area, an insufficient-charging penalty in the charging area, and a waiting penalty in the waiting area. Second, a multi-agent deep reinforcement learning algorithm based on the centralized training with decentralized execution framework is developed. The algorithm uses an attention network to model interactions among the agents' observations and embeds an action-mask layer to filter out invalid actions; each charger acts as an agent that decides its charging power at every time slot. Finally, actual charging station operating data from Xi'an, China, are used to validate the effectiveness of the proposed approach in improving the overall benefits of the charging station, as well as the scalability of the algorithm.
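
A minimal sketch of the action-mask idea described above, assuming a discrete set of charging-power levels per charger-agent (this is not the authors' implementation): infeasible power levels are given a probability of exactly zero before an action is sampled. The class name MaskedChargerActor, the layer sizes, the number of power levels, and the toy feasibility mask are illustrative assumptions only; the paper's attention network and centralized training procedure are not reproduced here.

    # Hypothetical sketch (PyTorch): one charger-agent picks a discrete charging-power
    # level each time slot; an action-mask layer removes invalid levels (e.g. the
    # charger is idle, or the level would exceed the EV's remaining capacity) before
    # the softmax, so they can never be sampled. All names and sizes are assumptions.
    import torch
    import torch.nn as nn

    class MaskedChargerActor(nn.Module):
        def __init__(self, obs_dim: int, n_power_levels: int, hidden: int = 64):
            super().__init__()
            self.policy = nn.Sequential(
                nn.Linear(obs_dim, hidden), nn.ReLU(),
                nn.Linear(hidden, n_power_levels),
            )

        def forward(self, obs: torch.Tensor, action_mask: torch.Tensor) -> torch.Tensor:
            # obs:         (batch, obs_dim) local observation of one charger-agent
            # action_mask: (batch, n_power_levels), 1 = feasible level, 0 = invalid
            logits = self.policy(obs)
            # Action-mask layer: drive invalid logits to -inf so the softmax assigns
            # them zero probability.
            logits = logits.masked_fill(action_mask == 0, float("-inf"))
            return torch.softmax(logits, dim=-1)

    # Toy usage: 5 power levels, the two highest currently infeasible.
    actor = MaskedChargerActor(obs_dim=8, n_power_levels=5)
    obs = torch.randn(1, 8)
    mask = torch.tensor([[1, 1, 1, 0, 0]])
    probs = actor(obs, mask)               # invalid levels get probability 0
    action = torch.multinomial(probs, 1)   # sampled charging-power level

In a centralized training with decentralized execution setup, a centralized critic would additionally consume all chargers' observations during training, while each actor like the one above relies only on its local observation at execution time.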

Suggested Citation

  • Li, Yujing & Zhang, Zhisheng & Xing, Qiang, 2025. "Real-time online charging control of electric vehicle charging station based on a multi-agent deep reinforcement learning," Energy, Elsevier, vol. 319(C).
  • Handle: RePEc:eee:energy:v:319:y:2025:i:c:s0360544225007376
    DOI: 10.1016/j.energy.2025.135095

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0360544225007376
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.energy.2025.135095?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Li, Xiaohui & Wang, Zhenpo & Zhang, Lei & Sun, Fengchun & Cui, Dingsong & Hecht, Christopher & Figgener, Jan & Sauer, Dirk Uwe, 2023. "Electric vehicle behavior modeling and applications in vehicle-grid integration: An overview," Energy, Elsevier, vol. 268(C).
    2. Zhou, Kaile & Cheng, Lexin & Lu, Xinhui & Wen, Lulu, 2020. "Scheduling model of electric vehicles charging considering inconvenience and dynamic electricity prices," Applied Energy, Elsevier, vol. 276(C).
    3. Andrew W Thompson & Yannick Perez, 2019. "Vehicle-to-Anything (V2X) Energy Services, Value Streams, and Regulatory Policy Implications," Working Papers hal-02265826, HAL.
    4. Qiu, Dawei & Wang, Yi & Hua, Weiqi & Strbac, Goran, 2023. "Reinforcement learning for electric vehicle applications in power systems: A critical review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 173(C).
    5. Chen, Jiahui & Wang, Fang & He, Xiaoyi & Liang, Xinyu & Huang, Junling & Zhang, Shaojun & Wu, Ye, 2022. "Emission mitigation potential from coordinated charging schemes for future private electric vehicles," Applied Energy, Elsevier, vol. 308(C).
    6. Li, Zhikang & Ma, Chengbin, 2022. "A temporal–spatial charging coordination scheme incorporating probability of EV charging availability," Applied Energy, Elsevier, vol. 325(C).
    7. Park, Keonwoo & Moon, Ilkyeong, 2022. "Multi-agent deep reinforcement learning approach for EV charging scheduling in a smart grid," Applied Energy, Elsevier, vol. 328(C).
    8. Lai, Chun Sing & Chen, Dashen & Zhang, Jinning & Zhang, Xin & Xu, Xu & Taylor, Gareth A. & Lai, Loi Lei, 2022. "Profit maximization for large-scale energy storage systems to enable fast EV charging infrastructure in distribution networks," Energy, Elsevier, vol. 259(C).
    9. Tuchnitz, Felix & Ebell, Niklas & Schlund, Jonas & Pruckner, Marco, 2021. "Development and Evaluation of a Smart Charging Strategy for an Electric Vehicle Fleet Based on Reinforcement Learning," Applied Energy, Elsevier, vol. 285(C).
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Juan Zhan & Mei Huang & Xiaojia Sun & Zuowei Chen & Zhihan Zhang & Yang Li & Yubo Zhang & Qian Ai, 2025. "Coordinated Interaction Strategy of User-Side EV Charging Piles for Distribution Network Power Stability," Energies, MDPI, vol. 18(8), pages 1-22, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Zhao, Zhonghao & Lee, Carman K.M. & Ren, Jingzheng, 2024. "A two-level charging scheduling method for public electric vehicle charging stations considering heterogeneous demand and nonlinear charging profile," Applied Energy, Elsevier, vol. 355(C).
    2. Abid, Md. Shadman & Apon, Hasan Jamil & Hossain, Salman & Ahmed, Ashik & Ahshan, Razzaqul & Lipu, M.S. Hossain, 2024. "A novel multi-objective optimization based multi-agent deep reinforcement learning approach for microgrid resources planning," Applied Energy, Elsevier, vol. 353(PA).
    3. Zhao, Zhonghao & Lee, Carman K.M. & Huo, Jiage, 2023. "EV charging station deployment on coupled transportation and power distribution networks via reinforcement learning," Energy, Elsevier, vol. 267(C).
    4. Xu, Hairun & Zhang, Ao & Wang, Qingle & Hu, Yang & Fang, Fang & Cheng, Long, 2025. "Quantum Reinforcement Learning for real-time optimization in Electric Vehicle charging systems," Applied Energy, Elsevier, vol. 383(C).
    5. Paudel, Diwas & Das, Tapas K., 2023. "A deep reinforcement learning approach for power management of battery-assisted fast-charging EV hubs participating in day-ahead and real-time electricity markets," Energy, Elsevier, vol. 283(C).
    6. Li, Xiaohui & Wang, Zhenpo & Zhang, Lei & Sun, Fengchun & Cui, Dingsong & Hecht, Christopher & Figgener, Jan & Sauer, Dirk Uwe, 2023. "Electric vehicle behavior modeling and applications in vehicle-grid integration: An overview," Energy, Elsevier, vol. 268(C).
    7. Zhao, Zhonghao & Lee, Carman K.M. & Yan, Xiaoyuan & Wang, Haonan, 2024. "Reinforcement learning for electric vehicle charging scheduling: A systematic review," Transportation Research Part E: Logistics and Transportation Review, Elsevier, vol. 190(C).
    8. Pegah Alaee & Julius Bems & Amjad Anvari-Moghaddam, 2023. "A Review of the Latest Trends in Technical and Economic Aspects of EV Charging Management," Energies, MDPI, vol. 16(9), pages 1-28, April.
    9. Tepe, Benedikt & Figgener, Jan & Englberger, Stefan & Sauer, Dirk Uwe & Jossen, Andreas & Hesse, Holger, 2022. "Optimal pool composition of commercial electric vehicles in V2G fleet operation of various electricity markets," Applied Energy, Elsevier, vol. 308(C).
    10. Chen, Guibin & Yang, Lun & Cao, Xiaoyu, 2025. "A deep reinforcement learning-based charging scheduling approach with augmented Lagrangian for electric vehicles," Applied Energy, Elsevier, vol. 378(PA).
    11. Jin, Ruiyang & Zhou, Yuke & Lu, Chao & Song, Jie, 2022. "Deep reinforcement learning-based strategy for charging station participating in demand response," Applied Energy, Elsevier, vol. 328(C).
    12. Harasis, Salman & Khan, Irfan & Massoud, Ahmed, 2024. "Enabling large-scale integration of electric bus fleets in harsh environments: Possibilities, potentials, and challenges," Energy, Elsevier, vol. 300(C).
    13. Fu, Zhi & Liu, Xiaochen & Zhang, Ji & Zhang, Tao & Liu, Xiaohua & Jiang, Yi, 2025. "Orderly solar charging of electric vehicles and its impact on charging behavior: A year-round field experiment," Applied Energy, Elsevier, vol. 381(C).
    14. Yin, Rumeng & He, Jiang, 2023. "Design of a photovoltaic electric bike battery-sharing system in public transit stations," Applied Energy, Elsevier, vol. 332(C).
    15. Zhao, Yang & Jiang, Ziyue & Chen, Xinyu & Liu, Peng & Peng, Tianduo & Shu, Zhan, 2023. "Toward environmental sustainability: data-driven analysis of energy use patterns and load profiles for urban electric vehicle fleets," Energy, Elsevier, vol. 285(C).
    16. Samuel M. Muhindo & Roland P. Malhamé & Geza Joos, 2021. "A Novel Mean Field Game-Based Strategy for Charging Electric Vehicles in Solar Powered Parking Lots," Energies, MDPI, vol. 14(24), pages 1-21, December.
    17. Quintero Fuentes, Abel & Hickman, Mark & Whitehead, Jake, 2025. "Zone substations' readiness to embrace electric vehicle adoption: Brisbane case study," Energy, Elsevier, vol. 322(C).
    18. Powell, Siobhan & Martin, Sonia & Rajagopal, Ram & Azevedo, Inês M.L. & de Chalendar, Jacques, 2024. "Future-proof rates for controlled electric vehicle charging: Comparing multi-year impacts of different emission factor signals," Energy Policy, Elsevier, vol. 190(C).
    19. Phillip K. Agbesi & Rico Ruffino & Marko Hakovirta, 2024. "Creating an optimal electric vehicle ecosystem: an investigation of electric vehicle stakeholders and ecosystem trends in the US," SN Business & Economics, Springer, vol. 4(3), pages 1-36, March.
    20. Cheng, Xiu & Li, Wenbo & Yang, Jiameng & Zhang, Linling, 2023. "How convenience and informational tools shape waste separation behavior: A social network approach," Resources Policy, Elsevier, vol. 86(PB).

    More about this item


    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:energy:v:319:y:2025:i:c:s0360544225007376. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.journals.elsevier.com/energy .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.