
Robust Energy Management Policies for Solar Microgrids via Reinforcement Learning

Author

Listed:
  • Gerald Jones

    (Department of Industrial and Systems Engineering, University of Tennessee, Knoxville, TN 37996, USA)

  • Xueping Li

    (Department of Industrial and Systems Engineering, University of Tennessee, Knoxville, TN 37996, USA)

  • Yulin Sun

    (School of Accounting, Southwestern University of Finance and Economics, Chengdu 610074, China)

Abstract

As the integration of renewable energy expands, effective energy system management becomes increasingly crucial. Distributed renewable-generation microgrids offer green energy and resilience, but because renewable generation is variable, combining them with energy storage and a suitable energy management system (EMS) is essential. Reinforcement learning (RL)-based EMSs have shown promising results in handling these complexities. However, as intermittent grid disruptions and disconnections from the main utility become more frequent, concerns about policy robustness arise. This study investigates the resilience of RL-based EMSs to unforeseen grid disconnections when trained in grid-connected scenarios. Specifically, we evaluate the resilience of policies derived from advantage actor–critic (A2C) and proximal policy optimization (PPO) networks trained in both grid-connected and uncertain grid-connectivity scenarios. The simulation employs stochastic models that incorporate solar-energy and load uncertainties and draw on real-world data. Our findings indicate that grid-trained PPO and A2C excel in cost coverage, with PPO performing better. In isolated or uncertain-connectivity scenarios, however, the demand-coverage hierarchy shifts: the disruption-trained A2C model achieves the best demand coverage when islanded, whereas the grid-connected A2C network performs best under uncertain grid connectivity. This study enhances the understanding of the resilience of RL-based solutions under varied training methods and provides an analysis of the resulting EMS policies.
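The kind of simulation the abstract describes can be illustrated with a toy environment. The sketch below is not the paper's model: the `SolarMicrogridEnv` class, its dynamics, and all parameter values are illustrative assumptions. It only shows the general shape of such a setup, namely stochastic solar generation and load, a battery, a random chance of islanding, and a reward that penalizes unmet demand, which is the quantity an A2C or PPO policy would be trained to protect.

```python
import random


class SolarMicrogridEnv:
    """Minimal sketch of a stochastic solar-microgrid environment.

    State: (solar output, load demand, battery level, grid connected).
    Action: fraction of any net surplus to store in the battery;
    deficits are drawn from the battery first, then from the grid
    when connected. All dynamics and numbers here are illustrative.
    """

    def __init__(self, capacity=10.0, p_disconnect=0.0, seed=0):
        self.capacity = capacity          # battery capacity (kWh)
        self.p_disconnect = p_disconnect  # per-step chance of islanding
        self.rng = random.Random(seed)
        self.reset()

    def reset(self):
        self.battery = 0.5 * self.capacity
        self.connected = True
        return self._observe()

    def _observe(self):
        # Stochastic solar generation and load: Gaussian stand-ins for
        # the real-world data a study like this would sample from.
        self.solar = max(0.0, self.rng.gauss(3.0, 1.0))
        self.load = max(0.1, self.rng.gauss(2.5, 0.8))
        return (self.solar, self.load, self.battery, self.connected)

    def step(self, store_fraction):
        # Random islanding events model uncertain grid connectivity.
        if self.connected and self.rng.random() < self.p_disconnect:
            self.connected = False

        net = self.solar - self.load
        unmet = 0.0
        if net >= 0.0:
            # Surplus: store the chosen fraction, curtail the rest.
            self.battery = min(self.capacity,
                               self.battery + store_fraction * net)
        else:
            deficit = -net
            drawn = min(self.battery, deficit)
            self.battery -= drawn
            deficit -= drawn
            if not self.connected:
                unmet = deficit  # islanded: leftover demand is unserved
        # Reward demand coverage: zero when all load is served,
        # negative in proportion to unserved demand.
        reward = -unmet
        return self._observe(), reward
```

Wrapped in a standard RL interface (for example the Gymnasium `Env` API), an environment of this shape could be trained against with off-the-shelf A2C or PPO implementations, and the "grid-connected" versus "uncertain-connectivity" training regimes compared in the paper would correspond to setting `p_disconnect` to zero or to a positive value during training.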

Suggested Citation

  • Gerald Jones & Xueping Li & Yulin Sun, 2024. "Robust Energy Management Policies for Solar Microgrids via Reinforcement Learning," Energies, MDPI, vol. 17(12), pages 1-22, June.
  • Handle: RePEc:gam:jeners:v:17:y:2024:i:12:p:2821-:d:1411207

    Download full text from publisher

    File URL: https://www.mdpi.com/1996-1073/17/12/2821/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/1996-1073/17/12/2821/
    Download Restriction: no

    References listed on IDEAS

    1. Zhang, Bin & Hu, Weihao & Xu, Xiao & Li, Tao & Zhang, Zhenyuan & Chen, Zhe, 2022. "Physical-model-free intelligent energy management for a grid-connected hybrid wind-microturbine-PV-EV energy system via deep reinforcement learning approach," Renewable Energy, Elsevier, vol. 200(C), pages 433-448.
    2. Tabar, Vahid Sohrabi & Abbasi, Vahid, 2019. "Energy management in microgrid with considering high penetration of renewable resources and surplus power generation problem," Energy, Elsevier, vol. 189(C).
    3. Guo, Chenyu & Wang, Xin & Zheng, Yihui & Zhang, Feng, 2022. "Real-time optimal energy management of microgrid with uncertainties based on deep reinforcement learning," Energy, Elsevier, vol. 238(PC).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Lu, Yu & Xiang, Yue & Huang, Yuan & Yu, Bin & Weng, Liguo & Liu, Junyong, 2023. "Deep reinforcement learning based optimal scheduling of active distribution system considering distributed generation, energy storage and flexible load," Energy, Elsevier, vol. 271(C).
    2. Wang, Yi & Qiu, Dawei & Sun, Mingyang & Strbac, Goran & Gao, Zhiwei, 2023. "Secure energy management of multi-energy microgrid: A physical-informed safe reinforcement learning approach," Applied Energy, Elsevier, vol. 335(C).
    3. Lan, Penghang & Chen, She & Li, Qihang & Li, Kelin & Wang, Feng & Zhao, Yaoxun, 2024. "Intelligent hydrogen-ammonia combined energy storage system with deep reinforcement learning," Renewable Energy, Elsevier, vol. 237(PB).
    4. Constantino Dário Justo & José Eduardo Tafula & Pedro Moura, 2022. "Planning Sustainable Energy Systems in the Southern African Development Community: A Review of Power Systems Planning Approaches," Energies, MDPI, vol. 15(21), pages 1-28, October.
    5. Dong, Xiao-Jian & Shen, Jia-Ni & Ma, Zi-Feng & He, Yi-Jun, 2025. "Stochastic optimization of integrated electric vehicle charging stations under photovoltaic uncertainty and battery power constraints," Energy, Elsevier, vol. 314(C).
    6. Xiong, Kang & Hu, Weihao & Cao, Di & Li, Sichen & Zhang, Guozhou & Liu, Wen & Huang, Qi & Chen, Zhe, 2023. "Coordinated energy management strategy for multi-energy hub with thermo-electrochemical effect based power-to-ammonia: A multi-agent deep reinforcement learning enabled approach," Renewable Energy, Elsevier, vol. 214(C), pages 216-232.
    7. Zhu, Ziqing & Hu, Ze & Chan, Ka Wing & Bu, Siqi & Zhou, Bin & Xia, Shiwei, 2023. "Reinforcement learning in deregulated energy market: A comprehensive review," Applied Energy, Elsevier, vol. 329(C).
    8. Zhang, Yiwen & Lin, Rui & Mei, Zhen & Lyu, Minghao & Jiang, Huaiguang & Xue, Ying & Zhang, Jun & Gao, David Wenzhong, 2024. "Interior-point policy optimization based multi-agent deep reinforcement learning method for secure home energy management under various uncertainties," Applied Energy, Elsevier, vol. 376(PA).
    9. Pinciroli, Luca & Baraldi, Piero & Compare, Michele & Zio, Enrico, 2023. "Optimal operation and maintenance of energy storage systems in grid-connected microgrids by deep reinforcement learning," Applied Energy, Elsevier, vol. 352(C).
    10. Bio Gassi, Karim & Baysal, Mustafa, 2023. "Improving real-time energy decision-making model with an actor-critic agent in modern microgrids with energy storage devices," Energy, Elsevier, vol. 263(PE).
    11. Zhu, Dafeng & Yang, Bo & Liu, Yuxiang & Wang, Zhaojian & Ma, Kai & Guan, Xinping, 2022. "Energy management based on multi-agent deep reinforcement learning for a multi-energy industrial park," Applied Energy, Elsevier, vol. 311(C).
    12. Wang, Jiawei & Wang, Yi & Qiu, Dawei & Su, Hanguang & Strbac, Goran & Gao, Zhiwei, 2025. "Resilient energy management of a multi-energy building under low-temperature district heating: A deep reinforcement learning approach," Applied Energy, Elsevier, vol. 378(PA).
    13. Jiao, P.H. & Chen, J.J. & Peng, K. & Zhao, Y.L. & Xin, K.F., 2020. "Multi-objective mean-semi-entropy model for optimal standalone micro-grid planning with uncertain renewable energy resources," Energy, Elsevier, vol. 191(C).
    14. Qi, Yunying & Xu, Xiao & Liu, Youbo & Pan, Li & Liu, Junyong & Hu, Weihao, 2024. "Intelligent energy management for an on-grid hydrogen refueling station based on dueling double deep Q network algorithm with NoisyNet," Renewable Energy, Elsevier, vol. 222(C).
    15. Yin, Linfei & Li, Yu, 2022. "Hybrid multi-agent emotional deep Q network for generation control of multi-area integrated energy systems," Applied Energy, Elsevier, vol. 324(C).
    16. Mukhopadhyay, Bineeta & Das, Debapriya, 2021. "Optimal multi-objective expansion planning of a droop-regulated islanded microgrid," Energy, Elsevier, vol. 218(C).
    17. Qu, Kai & Si, Gangquan & Wang, Qianyue & Xu, Minglin & Shan, Zihan, 2025. "Improving economic operation of a microgrid through expert behaviors and prediction intervals," Applied Energy, Elsevier, vol. 383(C).
    18. Wang, Can & Zhang, Jiaheng & Wang, Aoqi & Wang, Zhen & Yang, Nan & Zhao, Zhuoli & Lai, Chun Sing & Lai, Loi Lei, 2024. "Prioritized sum-tree experience replay TD3 DRL-based online energy management of a residential microgrid," Applied Energy, Elsevier, vol. 368(C).
    19. Alireza Gorjian & Mohsen Eskandari & Mohammad H. Moradi, 2023. "Conservation Voltage Reduction in Modern Power Systems: Applications, Implementation, Quantification, and AI-Assisted Techniques," Energies, MDPI, vol. 16(5), pages 1-36, March.
    20. Zhou, Yuekuan & Liu, Xiaohua & Zhao, Qianchuan, 2024. "A stochastic vehicle schedule model for demand response and grid flexibility in a renewable-building-e-transportation-microgrid," Renewable Energy, Elsevier, vol. 221(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jeners:v:17:y:2024:i:12:p:2821-:d:1411207. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.